| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
yzhuang/autotree_automl_heloc_gosdt_l512_d3_sd2 | 2023-08-26T03:57:17.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 11682400000
num_examples: 100000
- name: validation
num_bytes: 1168240000
num_examples: 10000
download_size: 1498631206
dataset_size: 12850640000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_heloc_gosdt_l512_d3_sd1 | 2023-08-26T04:01:54.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 11682400000
num_examples: 100000
- name: validation
num_bytes: 1168240000
num_examples: 10000
download_size: 1471483972
dataset_size: 12850640000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qbo-odp/anns | 2023-08-29T01:55:29.000Z | [
"region:us"
] | qbo-odp | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_codellama__CodeLlama-7b-hf | 2023-08-27T12:43:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-7b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T04:20:17.128606](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-hf/blob/main/results_2023-08-26T04%3A20%3A17.128606.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3145646168036776,\n\
\ \"acc_stderr\": 0.03351147029279506,\n \"acc_norm\": 0.31774475533378493,\n\
\ \"acc_norm_stderr\": 0.033513816423782074,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3782167557307672,\n\
\ \"mc2_stderr\": 0.014267357003813852\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3660409556313993,\n \"acc_stderr\": 0.014077223108470144,\n\
\ \"acc_norm\": 0.3993174061433447,\n \"acc_norm_stderr\": 0.014312094557946702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4536944831706831,\n\
\ \"acc_stderr\": 0.00496833714413636,\n \"acc_norm\": 0.6080462059350727,\n\
\ \"acc_norm_stderr\": 0.0048718874228935866\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32075471698113206,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.32075471698113206,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.032424147574830996,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.032424147574830996\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3741935483870968,\n\
\ \"acc_stderr\": 0.027528904299845794,\n \"acc_norm\": 0.3741935483870968,\n\
\ \"acc_norm_stderr\": 0.027528904299845794\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3357798165137615,\n \"acc_stderr\": 0.02024808139675293,\n \"\
acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.02024808139675293\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.033086111132364336,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.033086111132364336\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.350210970464135,\n \"acc_stderr\": 0.031052391937584353,\n \
\ \"acc_norm\": 0.350210970464135,\n \"acc_norm_stderr\": 0.031052391937584353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.03642914578292404,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.03642914578292404\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347019,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347019\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041694,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041694\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32905982905982906,\n\
\ \"acc_stderr\": 0.030782321577688163,\n \"acc_norm\": 0.32905982905982906,\n\
\ \"acc_norm_stderr\": 0.030782321577688163\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4278416347381865,\n\
\ \"acc_stderr\": 0.017692787927803728,\n \"acc_norm\": 0.4278416347381865,\n\
\ \"acc_norm_stderr\": 0.017692787927803728\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38562091503267976,\n \"acc_stderr\": 0.02787074527829032,\n\
\ \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.02787074527829032\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3408360128617363,\n\
\ \"acc_stderr\": 0.02692084126077616,\n \"acc_norm\": 0.3408360128617363,\n\
\ \"acc_norm_stderr\": 0.02692084126077616\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27509778357235987,\n\
\ \"acc_stderr\": 0.011405443620996915,\n \"acc_norm\": 0.27509778357235987,\n\
\ \"acc_norm_stderr\": 0.011405443620996915\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.027678468642144703,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.027678468642144703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913509,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529916,\n\
\ \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529916\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3782167557307672,\n\
\ \"mc2_stderr\": 0.014267357003813852\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:20:17.128606.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:20:17.128606.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:20:17.128606.parquet'
- config_name: results
data_files:
- split: 2023_08_26T04_20_17.128606
path:
- results_2023-08-26T04:20:17.128606.parquet
- split: latest
path:
- results_2023-08-26T04:20:17.128606.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-7b-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
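The same pattern works for any of the per-task configurations listed in the metadata above. The config name can be derived mechanically from the task key that appears in the results JSON, as sketched by this small helper (hypothetical, not part of the card's tooling):

```python
def task_to_config(task: str) -> str:
    """Map a results-JSON task key to its dataset config name,
    e.g. 'harness|truthfulqa:mc|0' -> 'harness_truthfulqa_mc_0'."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

# For instance, to load the world-religions details at the latest timestamp:
# data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-7b-hf",
#                     task_to_config("harness|hendrycksTest-world_religions|5"),
#                     split="latest")
print(task_to_config("harness|hendrycksTest-world_religions|5"))
# harness_hendrycksTest_world_religions_5
```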
## Latest results
These are the [latest results from run 2023-08-26T04:20:17.128606](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-hf/blob/main/results_2023-08-26T04%3A20%3A17.128606.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3145646168036776,
"acc_stderr": 0.03351147029279506,
"acc_norm": 0.31774475533378493,
"acc_norm_stderr": 0.033513816423782074,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3782167557307672,
"mc2_stderr": 0.014267357003813852
},
"harness|arc:challenge|25": {
"acc": 0.3660409556313993,
"acc_stderr": 0.014077223108470144,
"acc_norm": 0.3993174061433447,
"acc_norm_stderr": 0.014312094557946702
},
"harness|hellaswag|10": {
"acc": 0.4536944831706831,
"acc_stderr": 0.00496833714413636,
"acc_norm": 0.6080462059350727,
"acc_norm_stderr": 0.0048718874228935866
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32075471698113206,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.32075471698113206,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.032424147574830996,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.032424147574830996
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309994,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309994
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3741935483870968,
"acc_stderr": 0.027528904299845794,
"acc_norm": 0.3741935483870968,
"acc_norm_stderr": 0.027528904299845794
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.02024808139675293,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.02024808139675293
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.033086111132364336,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.033086111132364336
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.350210970464135,
"acc_stderr": 0.031052391937584353,
"acc_norm": 0.350210970464135,
"acc_norm_stderr": 0.031052391937584353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212095,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212095
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.03642914578292404,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.03642914578292404
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347019,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347019
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.04750458399041694,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.04750458399041694
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32905982905982906,
"acc_stderr": 0.030782321577688163,
"acc_norm": 0.32905982905982906,
"acc_norm_stderr": 0.030782321577688163
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4278416347381865,
"acc_stderr": 0.017692787927803728,
"acc_norm": 0.4278416347381865,
"acc_norm_stderr": 0.017692787927803728
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38562091503267976,
"acc_stderr": 0.02787074527829032,
"acc_norm": 0.38562091503267976,
"acc_norm_stderr": 0.02787074527829032
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3408360128617363,
"acc_stderr": 0.02692084126077616,
"acc_norm": 0.3408360128617363,
"acc_norm_stderr": 0.02692084126077616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.31790123456790126,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.31790123456790126,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27509778357235987,
"acc_stderr": 0.011405443620996915,
"acc_norm": 0.27509778357235987,
"acc_norm_stderr": 0.011405443620996915
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.027678468642144703,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.027678468642144703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913509,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529916,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529916
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3782167557307672,
"mc2_stderr": 0.014267357003813852
}
}
```
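Since TruthfulQA is the only task above reporting `mc1`/`mc2`, the aggregate `all` block carries those values over verbatim. A quick sanity check on a minimal excerpt of the JSON (values copied from above) illustrates the structure:

```python
# Minimal excerpt of the results JSON above (values copied verbatim).
results = {
    "all": {
        "mc1": 0.23745410036719705,
        "mc2": 0.3782167557307672,
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.23745410036719705,
        "mc2": 0.3782167557307672,
    },
}

# TruthfulQA is the sole source of mc1/mc2, so the aggregate equals the task value.
assert results["all"]["mc1"] == results["harness|truthfulqa:mc|0"]["mc1"]
assert results["all"]["mc2"] == results["harness|truthfulqa:mc|0"]["mc2"]
```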
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
shaojiang/wenyanwen | 2023-08-26T04:39:17.000Z | [
"region:us"
] | shaojiang | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf | 2023-08-27T12:43:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NousResearch/CodeLlama-34b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/CodeLlama-34b-hf](https://huggingface.co/NousResearch/CodeLlama-34b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T04:48:17.440962](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf/blob/main/results_2023-08-26T04%3A48%3A17.440962.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36985893005920306,\n\
\ \"acc_stderr\": 0.03480324288558035,\n \"acc_norm\": 0.3711436021174746,\n\
\ \"acc_norm_stderr\": 0.034811802518316066,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3889056985901338,\n\
\ \"mc2_stderr\": 0.014062623546242098\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3361774744027304,\n \"acc_stderr\": 0.013804855026205758,\n\
\ \"acc_norm\": 0.37542662116040953,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2818163712407887,\n\
\ \"acc_stderr\": 0.004489648865080891,\n \"acc_norm\": 0.31836287592113127,\n\
\ \"acc_norm_stderr\": 0.00464889078758169\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179326,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179326\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.031158522131357797,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.031158522131357797\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.023330654054535892,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.023330654054535892\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.43870967741935485,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.43870967741935485,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.47474747474747475,\n \"acc_stderr\": 0.03557806245087314,\n \"\
acc_norm\": 0.47474747474747475,\n \"acc_norm_stderr\": 0.03557806245087314\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442205,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442205\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.024666744915187215,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.024666744915187215\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.03156663099215416,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4073394495412844,\n \"acc_stderr\": 0.021065986244412884,\n \"\
acc_norm\": 0.4073394495412844,\n \"acc_norm_stderr\": 0.021065986244412884\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.33796296296296297,\n \"acc_stderr\": 0.032259413526312945,\n \"\
acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.032259413526312945\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842572,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842572\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5341880341880342,\n\
\ \"acc_stderr\": 0.03267942734081228,\n \"acc_norm\": 0.5341880341880342,\n\
\ \"acc_norm_stderr\": 0.03267942734081228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4929757343550447,\n\
\ \"acc_stderr\": 0.017878199003432214,\n \"acc_norm\": 0.4929757343550447,\n\
\ \"acc_norm_stderr\": 0.017878199003432214\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.025992472029306386,\n\
\ \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.025992472029306386\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2223463687150838,\n\
\ \"acc_stderr\": 0.013907189208156881,\n \"acc_norm\": 0.2223463687150838,\n\
\ \"acc_norm_stderr\": 0.013907189208156881\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.02736359328468494,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.02736359328468494\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4855305466237942,\n\
\ \"acc_stderr\": 0.028386198084177687,\n \"acc_norm\": 0.4855305466237942,\n\
\ \"acc_norm_stderr\": 0.028386198084177687\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.027125115513166858,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.027125115513166858\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140245,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140245\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\
\ \"acc_stderr\": 0.011311347690633898,\n \"acc_norm\": 0.26792698826597133,\n\
\ \"acc_norm_stderr\": 0.011311347690633898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2957516339869281,\n \"acc_stderr\": 0.018463154132632817,\n \
\ \"acc_norm\": 0.2957516339869281,\n \"acc_norm_stderr\": 0.018463154132632817\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.373134328358209,\n\
\ \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.373134328358209,\n\
\ \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.03722965741385539,\n\
\ \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.03722965741385539\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3889056985901338,\n\
\ \"mc2_stderr\": 0.014062623546242098\n }\n}\n```"
repo_url: https://huggingface.co/NousResearch/CodeLlama-34b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:48:17.440962.parquet'
- config_name: results
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- results_2023-08-26T04:48:17.440962.parquet
- split: latest
path:
- results_2023-08-26T04:48:17.440962.parquet
---
# Dataset Card for Evaluation run of NousResearch/CodeLlama-34b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/CodeLlama-34b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/CodeLlama-34b-hf](https://huggingface.co/NousResearch/CodeLlama-34b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf",
	"harness_truthfulqa_mc_0",
	split="latest")
```
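The timestamped split names appear to be the run timestamp with `-` and `:` replaced by `_` (compare the split names and the parquet file names in the configuration list above — this is an inferred convention, not a documented API):

```python
# Derive a per-run split name from a run timestamp.
# Convention inferred from the split/file names above.
timestamp = "2023-08-26T04:48:17.440962"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_08_26T04_48_17.440962
# split_name can then be passed as the `split` argument to load_dataset.
```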
## Latest results
These are the [latest results from run 2023-08-26T04:48:17.440962](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf/blob/main/results_2023-08-26T04%3A48%3A17.440962.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.36985893005920306,
"acc_stderr": 0.03480324288558035,
"acc_norm": 0.3711436021174746,
"acc_norm_stderr": 0.034811802518316066,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3889056985901338,
"mc2_stderr": 0.014062623546242098
},
"harness|arc:challenge|25": {
"acc": 0.3361774744027304,
"acc_stderr": 0.013804855026205758,
"acc_norm": 0.37542662116040953,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.2818163712407887,
"acc_stderr": 0.004489648865080891,
"acc_norm": 0.31836287592113127,
"acc_norm_stderr": 0.00464889078758169
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179326,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179326
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.031158522131357797,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.031158522131357797
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.023330654054535892,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.023330654054535892
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43870967741935485,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.43870967741935485,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.47474747474747475,
"acc_stderr": 0.03557806245087314,
"acc_norm": 0.47474747474747475,
"acc_norm_stderr": 0.03557806245087314
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442205,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442205
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.024666744915187215,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.024666744915187215
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4073394495412844,
"acc_stderr": 0.021065986244412884,
"acc_norm": 0.4073394495412844,
"acc_norm_stderr": 0.021065986244412884
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693254,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842572,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842572
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5341880341880342,
"acc_stderr": 0.03267942734081228,
"acc_norm": 0.5341880341880342,
"acc_norm_stderr": 0.03267942734081228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4929757343550447,
"acc_stderr": 0.017878199003432214,
"acc_norm": 0.4929757343550447,
"acc_norm_stderr": 0.017878199003432214
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.025992472029306386,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.025992472029306386
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2223463687150838,
"acc_stderr": 0.013907189208156881,
"acc_norm": 0.2223463687150838,
"acc_norm_stderr": 0.013907189208156881
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.02736359328468494,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.02736359328468494
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4855305466237942,
"acc_stderr": 0.028386198084177687,
"acc_norm": 0.4855305466237942,
"acc_norm_stderr": 0.028386198084177687
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.027125115513166858,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.027125115513166858
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140245,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140245
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.011311347690633898,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.011311347690633898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2957516339869281,
"acc_stderr": 0.018463154132632817,
"acc_norm": 0.2957516339869281,
"acc_norm_stderr": 0.018463154132632817
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.373134328358209,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.373134328358209,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.03722965741385539,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.03722965741385539
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3889056985901338,
"mc2_stderr": 0.014062623546242098
}
}
```
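The per-task entries above make it easy to recompute aggregate scores yourself; for instance, averaging `acc` over the MMLU (`hendrycksTest`) tasks. The sketch below uses two illustrative entries rather than the full 57, and is not the leaderboard's exact aggregation code:

```python
# Average accuracy over MMLU-style tasks from a results dict shaped
# like the one above (illustrative values, not the full result set).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.40},
    "harness|truthfulqa:mc|0": {"mc2": 0.3889056985901338},
}
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(results[k]["acc"] for k in mmlu_keys) / len(mmlu_keys)
print(round(mmlu_acc, 2))  # 0.37
```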
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Stardos/Asmodeus | 2023-08-26T05:08:40.000Z | [
"region:us"
] | Stardos | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_heloc_gosdt_l512_d3_sd3 | 2023-08-26T05:16:49.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 11682400000
num_examples: 100000
- name: validation
num_bytes: 1168240000
num_examples: 10000
download_size: 1508311822
dataset_size: 12850640000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_acrastt__OmegLLaMA-3B | 2023-08-27T12:43:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of acrastt/OmegLLaMA-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [acrastt/OmegLLaMA-3B](https://huggingface.co/acrastt/OmegLLaMA-3B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__OmegLLaMA-3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T05:16:42.253337](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__OmegLLaMA-3B/blob/main/results_2023-08-26T05%3A16%3A42.253337.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2849233795441898,\n\
\ \"acc_stderr\": 0.03260814040755716,\n \"acc_norm\": 0.28854734152664363,\n\
\ \"acc_norm_stderr\": 0.032609122609257954,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123904,\n \"mc2\": 0.3331382494059585,\n\
\ \"mc2_stderr\": 0.01330395634747459\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3583617747440273,\n \"acc_stderr\": 0.014012883334859859,\n\
\ \"acc_norm\": 0.4035836177474403,\n \"acc_norm_stderr\": 0.014337158914268441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4927305317665804,\n\
\ \"acc_stderr\": 0.0049892540118957615,\n \"acc_norm\": 0.6613224457279426,\n\
\ \"acc_norm_stderr\": 0.004722928332834051\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n\
\ \"acc_stderr\": 0.03476599607516479,\n \"acc_norm\": 0.2947976878612717,\n\
\ \"acc_norm_stderr\": 0.03476599607516479\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.039505818611799616,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.039505818611799616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212387,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212387\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25806451612903225,\n \"acc_stderr\": 0.024892469172462853,\n \"\
acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.024892469172462853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3153846153846154,\n \"acc_stderr\": 0.02355964698318995,\n \
\ \"acc_norm\": 0.3153846153846154,\n \"acc_norm_stderr\": 0.02355964698318995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3321100917431193,\n\
\ \"acc_stderr\": 0.020192682985423344,\n \"acc_norm\": 0.3321100917431193,\n\
\ \"acc_norm_stderr\": 0.020192682985423344\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n\
\ \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501936,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501936\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29535864978902954,\n \"acc_stderr\": 0.02969633871342288,\n \
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.02969633871342288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802287,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802287\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.01622501794477096,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.01622501794477096\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3092485549132948,\n \"acc_stderr\": 0.02488314057007175,\n\
\ \"acc_norm\": 0.3092485549132948,\n \"acc_norm_stderr\": 0.02488314057007175\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.02505850331695815,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.02505850331695815\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31511254019292606,\n\
\ \"acc_stderr\": 0.026385273703464496,\n \"acc_norm\": 0.31511254019292606,\n\
\ \"acc_norm_stderr\": 0.026385273703464496\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n\
\ \"acc_stderr\": 0.010906282617981647,\n \"acc_norm\": 0.23989569752281617,\n\
\ \"acc_norm_stderr\": 0.010906282617981647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987866,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987866\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399697,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399697\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123904,\n \"mc2\": 0.3331382494059585,\n\
\ \"mc2_stderr\": 0.01330395634747459\n }\n}\n```"
repo_url: https://huggingface.co/acrastt/OmegLLaMA-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:16:42.253337.parquet'
- config_name: results
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- results_2023-08-26T05:16:42.253337.parquet
- split: latest
path:
- results_2023-08-26T05:16:42.253337.parquet
---
# Dataset Card for Evaluation run of acrastt/OmegLLaMA-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/OmegLLaMA-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [acrastt/OmegLLaMA-3B](https://huggingface.co/acrastt/OmegLLaMA-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__OmegLLaMA-3B",
"harness_truthfulqa_mc_0",
split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T05:16:42.253337](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__OmegLLaMA-3B/blob/main/results_2023-08-26T05%3A16%3A42.253337.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2849233795441898,
"acc_stderr": 0.03260814040755716,
"acc_norm": 0.28854734152664363,
"acc_norm_stderr": 0.032609122609257954,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123904,
"mc2": 0.3331382494059585,
"mc2_stderr": 0.01330395634747459
},
"harness|arc:challenge|25": {
"acc": 0.3583617747440273,
"acc_stderr": 0.014012883334859859,
"acc_norm": 0.4035836177474403,
"acc_norm_stderr": 0.014337158914268441
},
"harness|hellaswag|10": {
"acc": 0.4927305317665804,
"acc_stderr": 0.0049892540118957615,
"acc_norm": 0.6613224457279426,
"acc_norm_stderr": 0.004722928332834051
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.03476599607516479,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.03476599607516479
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.039505818611799616,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.039505818611799616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.027678452578212387,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.027678452578212387
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462853,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3153846153846154,
"acc_stderr": 0.02355964698318995,
"acc_norm": 0.3153846153846154,
"acc_norm_stderr": 0.02355964698318995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3321100917431193,
"acc_stderr": 0.020192682985423344,
"acc_norm": 0.3321100917431193,
"acc_norm_stderr": 0.020192682985423344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501936,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501936
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802287,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802287
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.01622501794477096,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.01622501794477096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3092485549132948,
"acc_stderr": 0.02488314057007175,
"acc_norm": 0.3092485549132948,
"acc_norm_stderr": 0.02488314057007175
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.02505850331695815,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.02505850331695815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31511254019292606,
"acc_stderr": 0.026385273703464496,
"acc_norm": 0.31511254019292606,
"acc_norm_stderr": 0.026385273703464496
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.02612957252718085,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.02612957252718085
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981647,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987866,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987866
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399697,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399697
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123904,
"mc2": 0.3331382494059585,
"mc2_stderr": 0.01330395634747459
}
}
```
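The per-task entries in the results dict above can be inspected programmatically once loaded. A minimal sketch (reproducing only a handful of the entries from the JSON above for illustration; the full dict has one entry per task):

```python
# Minimal sketch: working with a results dict shaped like the JSON above.
# Only a few per-task entries are reproduced here for illustration.
results = {
    "all": {"acc": 0.2849233795441898, "acc_norm": 0.28854734152664363},
    "harness|arc:challenge|25": {"acc": 0.3583617747440273},
    "harness|hellaswag|10": {"acc": 0.4927305317665804},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
}

# Collect per-task accuracies, skipping the aggregate "all" entry.
task_accs = {name: m["acc"] for name, m in results.items() if name != "all"}

# Task with the highest accuracy in this (truncated) subset.
best_task = max(task_accs, key=task_accs.get)
print(best_task)  # harness|hellaswag|10
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` keys where a task reports them.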
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yudiwbs/eli5_id-llama2-1k | 2023-08-26T06:19:14.000Z | [
"region:us"
] | yudiwbs | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 821834
num_examples: 1000
download_size: 458403
dataset_size: 821834
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eli5_id-llama2-1k"
1,000 examples in LLaMA 2 format that can be used for fine-tuning.
Source: https://huggingface.co/datasets/indonesian-nlp/eli5_id/
|
open-llm-leaderboard/details_codellama__CodeLlama-34b-hf | 2023-09-17T13:13:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-34b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-34b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:13:18.038521](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-hf/blob/main/results_2023-09-17T13-13-18.038521.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.0004320097346038763,\n \"f1\": 0.049458892617449755,\n\
\ \"f1_stderr\": 0.0012523570997250966,\n \"acc\": 0.43630162765913527,\n\
\ \"acc_stderr\": 0.011053010908838208\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038763,\n\
\ \"f1\": 0.049458892617449755,\n \"f1_stderr\": 0.0012523570997250966\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \
\ \"acc_stderr\": 0.009629588445673819\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.012476433372002597\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-34b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_13_18.038521
path:
- '**/details_harness|drop|3_2023-09-17T13-13-18.038521.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-13-18.038521.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_13_18.038521
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-13-18.038521.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-13-18.038521.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:33:43.008439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:33:43.008439.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:33:43.008439.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_13_18.038521
path:
- '**/details_harness|winogrande|5_2023-09-17T13-13-18.038521.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-13-18.038521.parquet'
- config_name: results
data_files:
- split: 2023_08_26T05_33_43.008439
path:
- results_2023-08-26T05:33:43.008439.parquet
- split: 2023_09_17T13_13_18.038521
path:
- results_2023-09-17T13-13-18.038521.parquet
- split: latest
path:
- results_2023-09-17T13-13-18.038521.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-34b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-34b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-34b-hf",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T13:13:18.038521](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-hf/blob/main/results_2023-09-17T13-13-18.038521.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038763,
"f1": 0.049458892617449755,
"f1_stderr": 0.0012523570997250966,
"acc": 0.43630162765913527,
"acc_stderr": 0.011053010908838208
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038763,
"f1": 0.049458892617449755,
"f1_stderr": 0.0012523570997250966
},
"harness|gsm8k|5": {
"acc": 0.1425322213798332,
"acc_stderr": 0.009629588445673819
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.012476433372002597
}
}
```
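The aggregated results above are plain JSON, so once a `results_*.json` file has been downloaded, the per-task metrics can be pulled out with the standard library alone. A minimal sketch (the dict literal below is a truncated copy of the example above; no leaderboard-specific API is assumed):

```python
import json

# Truncated copy of the aggregated results shown above.
results_json = """
{
  "all": {"em": 0.0017827181208053692, "f1": 0.049458892617449755,
          "acc": 0.43630162765913527},
  "harness|gsm8k|5": {"acc": 0.1425322213798332,
                      "acc_stderr": 0.009629588445673819},
  "harness|winogrande|5": {"acc": 0.7300710339384373,
                           "acc_stderr": 0.012476433372002597}
}
"""

results = json.loads(results_json)

# Collect per-task accuracies, skipping the "all" aggregate
# and tasks (like drop) that report em/f1 instead of acc.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

for task, acc in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f}")
```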
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1 | 2023-09-17T17:56:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Phind/Phind-CodeLlama-34B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Phind/Phind-CodeLlama-34B-v1](https://huggingface.co/Phind/Phind-CodeLlama-34B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the most recent\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T17:56:04.803454](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1/blob/main/results_2023-09-17T17-56-04.803454.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3409186241610738,\n\
\ \"em_stderr\": 0.004854388549221253,\n \"f1\": 0.3901226929530212,\n\
\ \"f1_stderr\": 0.004753426310613145,\n \"acc\": 0.46541261736516804,\n\
\ \"acc_stderr\": 0.01182360456434163\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3409186241610738,\n \"em_stderr\": 0.004854388549221253,\n\
\ \"f1\": 0.3901226929530212,\n \"f1_stderr\": 0.004753426310613145\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2047005307050796,\n \
\ \"acc_stderr\": 0.011113916396062963\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.0125332927326203\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Phind/Phind-CodeLlama-34B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T17_56_04.803454
path:
- '**/details_harness|drop|3_2023-09-17T17-56-04.803454.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T17-56-04.803454.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T17_56_04.803454
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-56-04.803454.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-56-04.803454.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:41:49.471462.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:41:49.471462.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T17_56_04.803454
path:
- '**/details_harness|winogrande|5_2023-09-17T17-56-04.803454.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T17-56-04.803454.parquet'
- config_name: results
data_files:
- split: 2023_08_26T05_41_49.471462
path:
- results_2023-08-26T05:41:49.471462.parquet
- split: 2023_09_17T17_56_04.803454
path:
- results_2023-09-17T17-56-04.803454.parquet
- split: latest
path:
- results_2023-09-17T17-56-04.803454.parquet
---
# Dataset Card for Evaluation run of Phind/Phind-CodeLlama-34B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Phind/Phind-CodeLlama-34B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Phind/Phind-CodeLlama-34B-v1](https://huggingface.co/Phind/Phind-CodeLlama-34B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T17:56:04.803454](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v1/blob/main/results_2023-09-17T17-56-04.803454.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "latest" split of the corresponding eval):
```python
{
"all": {
"em": 0.3409186241610738,
"em_stderr": 0.004854388549221253,
"f1": 0.3901226929530212,
"f1_stderr": 0.004753426310613145,
"acc": 0.46541261736516804,
"acc_stderr": 0.01182360456434163
},
"harness|drop|3": {
"em": 0.3409186241610738,
"em_stderr": 0.004854388549221253,
"f1": 0.3901226929530212,
"f1_stderr": 0.004753426310613145
},
"harness|gsm8k|5": {
"acc": 0.2047005307050796,
"acc_stderr": 0.011113916396062963
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.0125332927326203
}
}
```
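For a quick look at these numbers without downloading any parquet files, the results block above can be handled as a plain Python dictionary — for example, to collect the accuracy of every individual task (a minimal sketch; the dict literal below is copied from the JSON above):

```python
# Aggregated metrics, copied from the latest results shown above.
latest_results = {
    "all": {
        "em": 0.3409186241610738,
        "em_stderr": 0.004854388549221253,
        "f1": 0.3901226929530212,
        "f1_stderr": 0.004753426310613145,
        "acc": 0.46541261736516804,
        "acc_stderr": 0.01182360456434163,
    },
    "harness|drop|3": {
        "em": 0.3409186241610738,
        "em_stderr": 0.004854388549221253,
        "f1": 0.3901226929530212,
        "f1_stderr": 0.004753426310613145,
    },
    "harness|gsm8k|5": {
        "acc": 0.2047005307050796,
        "acc_stderr": 0.011113916396062963,
    },
    "harness|winogrande|5": {
        "acc": 0.7261247040252565,
        "acc_stderr": 0.0125332927326203,
    },
}

# Pull out the accuracy of each task that reports one, skipping the "all" aggregate.
accuracies = {
    task: metrics["acc"]
    for task, metrics in latest_results.items()
    if task != "all" and "acc" in metrics
}
print(accuracies)
```

The same pattern works on the full results JSON fetched from the repo, since each `harness|…` key maps to one evaluated task.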
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-Python-v1 | 2023-09-17T16:02:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Phind/Phind-CodeLlama-34B-Python-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Phind/Phind-CodeLlama-34B-Python-v1](https://huggingface.co/Phind/Phind-CodeLlama-34B-Python-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-Python-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T16:02:38.595550](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-Python-v1/blob/main/results_2023-09-17T16-02-38.595550.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.265625,\n \
\ \"em_stderr\": 0.004523067479107055,\n \"f1\": 0.3185192953020138,\n\
\ \"f1_stderr\": 0.004482746835839152,\n \"acc\": 0.4517772845779581,\n\
\ \"acc_stderr\": 0.012170333746109104\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.265625,\n \"em_stderr\": 0.004523067479107055,\n \
\ \"f1\": 0.3185192953020138,\n \"f1_stderr\": 0.004482746835839152\n \
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21531463229719486,\n \
\ \"acc_stderr\": 0.011322096294579654\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6882399368587214,\n \"acc_stderr\": 0.013018571197638551\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Phind/Phind-CodeLlama-34B-Python-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T16_02_38.595550
path:
- '**/details_harness|drop|3_2023-09-17T16-02-38.595550.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T16-02-38.595550.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T16_02_38.595550
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-02-38.595550.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-02-38.595550.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:45:26.681000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:45:26.681000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:45:26.681000.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T16_02_38.595550
path:
- '**/details_harness|winogrande|5_2023-09-17T16-02-38.595550.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T16-02-38.595550.parquet'
- config_name: results
data_files:
- split: 2023_08_26T05_45_26.681000
path:
- results_2023-08-26T05:45:26.681000.parquet
- split: 2023_09_17T16_02_38.595550
path:
- results_2023-09-17T16-02-38.595550.parquet
- split: latest
path:
- results_2023-09-17T16-02-38.595550.parquet
---
# Dataset Card for Evaluation run of Phind/Phind-CodeLlama-34B-Python-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Phind/Phind-CodeLlama-34B-Python-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Phind/Phind-CodeLlama-34B-Python-v1](https://huggingface.co/Phind/Phind-CodeLlama-34B-Python-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-Python-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T16:02:38.595550](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-Python-v1/blob/main/results_2023-09-17T16-02-38.595550.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.265625,
"em_stderr": 0.004523067479107055,
"f1": 0.3185192953020138,
"f1_stderr": 0.004482746835839152,
"acc": 0.4517772845779581,
"acc_stderr": 0.012170333746109104
},
"harness|drop|3": {
"em": 0.265625,
"em_stderr": 0.004523067479107055,
"f1": 0.3185192953020138,
"f1_stderr": 0.004482746835839152
},
"harness|gsm8k|5": {
"acc": 0.21531463229719486,
"acc_stderr": 0.011322096294579654
},
"harness|winogrande|5": {
"acc": 0.6882399368587214,
"acc_stderr": 0.013018571197638551
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AhmadZaidi/lamini_docs | 2023-08-26T06:22:20.000Z | [
"region:us"
] | AhmadZaidi | null | null | null | 0 | 0 | Entry not found |
nikhilno1/guide | 2023-08-26T06:42:14.000Z | [
"language:en",
"license:apache-2.0",
"region:us"
] | nikhilno1 | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- en
pretty_name: User Guide
--- |
DataStudio/STT_01 | 2023-08-28T06:12:31.000Z | [
"task_categories:automatic-speech-recognition",
"size_categories:10K<n<100K",
"language:vi",
"region:us"
] | DataStudio | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: content
dtype: string
splits:
- name: train
num_bytes: 19935231704.56
num_examples: 72947
download_size: 13639897038
dataset_size: 19935231704.56
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- automatic-speech-recognition
language:
- vi
pretty_name: Speech-to-Text
size_categories:
- 10K<n<100K
---
# Dataset Card for "STT_01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vilifax/bday | 2023-08-26T06:54:47.000Z | [
"region:us"
] | vilifax | null | null | null | 0 | 0 | Entry not found |
Viswa123/inpout | 2023-08-26T07:06:11.000Z | [
"region:us"
] | Viswa123 | null | null | null | 0 | 0 | Entry not found |
rwkv-x-dev/slimpajama-world-tokenized | 2023-08-26T07:15:02.000Z | [
"region:us"
] | rwkv-x-dev | null | null | null | 0 | 0 | Entry not found |
SAM69/IMAGES_CAP | 2023-08-26T07:20:39.000Z | [
"license:bsd-3-clause",
"region:us"
] | SAM69 | null | null | null | 0 | 0 | ---
license: bsd-3-clause
---
|
yzhuang/autotree_automl_covertype_gosdt_l512_d3 | 2023-08-26T07:42:36.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2013799218
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_covertype_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaiku03/custom_complain_dataset_NER9 | 2023-08-26T08:09:13.000Z | [
"region:us"
] | kaiku03 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: ner_tags
dtype: string
- name: ner_tags_numeric
sequence: int64
splits:
- name: train
num_bytes: 15980
num_examples: 56
- name: validation
num_bytes: 2232
num_examples: 8
download_size: 7184
dataset_size: 18212
---
# Dataset Card for "custom_complain_dataset_NER9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sahand7798/SA_llama2_dataset | 2023-08-26T08:24:48.000Z | [
"region:us"
] | Sahand7798 | null | null | null | 0 | 0 | Entry not found |
S2T/audio-v2 | 2023-08-26T08:45:12.000Z | [
"region:us"
] | S2T | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: raw_transcription
dtype: string
splits:
- name: train
num_bytes: 1665722954.5
num_examples: 11420
- name: test
num_bytes: 141949259.0
num_examples: 1000
download_size: 1776326491
dataset_size: 1807672213.5
---
# Dataset Card for "audio-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wuming156/sdxllora | 2023-09-04T07:19:10.000Z | [
"region:us"
] | wuming156 | null | null | null | 0 | 0 | Entry not found |
FinchResearch/wiki-webcrawl | 2023-09-03T06:35:26.000Z | [
"region:us"
] | FinchResearch | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Faradaylab__Aria-70B | 2023-08-27T12:43:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Faradaylab/Aria-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Faradaylab/Aria-70B](https://huggingface.co/Faradaylab/Aria-70B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Faradaylab__Aria-70B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T09:05:40.294272](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__Aria-70B/blob/main/results_2023-08-26T09%3A05%3A40.294272.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6386983068475005,\n\
\ \"acc_stderr\": 0.032863621226889406,\n \"acc_norm\": 0.6425916297913504,\n\
\ \"acc_norm_stderr\": 0.03283781788418258,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.527991738544026,\n\
\ \"mc2_stderr\": 0.015530613367021443\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.01428589829293817,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6690898227444733,\n\
\ \"acc_stderr\": 0.004695791340502876,\n \"acc_norm\": 0.8586934873531169,\n\
\ \"acc_norm_stderr\": 0.003476255509644533\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894442,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894442\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768783,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768783\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163255,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163255\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.02362715946031867,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.02362715946031867\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.02991858670779883,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.02991858670779883\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.039800662464677665,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.039800662464677665\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371798,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371798\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.01633726869427011,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.01633726869427011\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718968,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718968\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.01275911706651802,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.01275911706651802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.527991738544026,\n\
\ \"mc2_stderr\": 0.015530613367021443\n }\n}\n```"
repo_url: https://huggingface.co/Faradaylab/Aria-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|arc:challenge|25_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hellaswag|10_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:05:40.294272.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:05:40.294272.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T09:05:40.294272.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T09:05:40.294272.parquet'
- config_name: results
data_files:
- split: 2023_08_26T09_05_40.294272
path:
- results_2023-08-26T09:05:40.294272.parquet
- split: latest
path:
- results_2023-08-26T09:05:40.294272.parquet
---
# Dataset Card for Evaluation run of Faradaylab/Aria-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Faradaylab/Aria-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Faradaylab/Aria-70B](https://huggingface.co/Faradaylab/Aria-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Faradaylab__Aria-70B",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T09:05:40.294272](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__Aria-70B/blob/main/results_2023-08-26T09%3A05%3A40.294272.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6386983068475005,
"acc_stderr": 0.032863621226889406,
"acc_norm": 0.6425916297913504,
"acc_norm_stderr": 0.03283781788418258,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.527991738544026,
"mc2_stderr": 0.015530613367021443
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.01428589829293817,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094087
},
"harness|hellaswag|10": {
"acc": 0.6690898227444733,
"acc_stderr": 0.004695791340502876,
"acc_norm": 0.8586934873531169,
"acc_norm_stderr": 0.003476255509644533
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894442,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894442
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768783,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768783
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163255,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.02362715946031867,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.02362715946031867
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.02991858670779883,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.02991858670779883
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.039800662464677665,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.039800662464677665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371798,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371798
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.01633726869427011,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.01633726869427011
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718968,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718968
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.01275911706651802,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.01275911706651802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.527991738544026,
"mc2_stderr": 0.015530613367021443
}
}
```
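As a minimal sketch of how the per-task entries above can be aggregated, the snippet below averages the `acc` values of the MMLU (`hendrycksTest`) subtasks. It uses only a small hand-copied excerpt of the dictionary shown above (the full run contains all 57 subtasks), and the variable names are illustrative, not part of any API:

```python
# Average MMLU ("hendrycksTest") accuracies from a results dictionary
# shaped like the one above. This excerpt copies three subtask entries;
# the real results dict holds all 57 MMLU subtasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5185185185185185},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7302631578947368},
}

# Keep only MMLU subtask entries, identified by their key prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"Mean MMLU accuracy over {len(mmlu_accs)} subtasks: {mean_mmlu_acc:.4f}")
```

The same key-prefix filter works on the full dictionary loaded from the "results" configuration.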
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SashaPlayzz/sm | 2023-08-26T09:31:47.000Z | [
"license:openrail",
"region:us"
] | SashaPlayzz | null | null | null | 0 | 0 | ---
license: openrail
---
|
arindammajee/news | 2023-09-06T10:17:28.000Z | [
"region:us"
] | arindammajee | null | null | null | 0 | 0 | |
keratonereport/Keratone-Toenail-Fungus | 2023-08-26T09:58:35.000Z | [
"region:us"
] | keratonereport | null | null | null | 0 | 0 | <h1 style="text-align: left;">Keratone Toenail Fungus</h1>
<p><span style="font-family: georgia;"><strong>Product Name - Keratone<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Side Effects - No Side Effects (100% Natural)</strong></span></p>
<p><span style="font-family: georgia;"><strong>Main Benefits - Longer & Beautiful nails</strong></span></p>
<p><span style="font-family: georgia;"><strong>Category - Deep Nails Cleaner Liquid<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Results - In Few Weeks<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Availability - Online</strong></span></p>
<p><span style="font-family: georgia;"><strong>Customer Reviews - ★★★★✰ 4.9/5</strong></span></p>
<p><span style="font-family: georgia;"><strong>Price - Visit <a href="https://www.healthsupplement24x7.com/get-keratone">Official Website</a></strong></span></p>
<p><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/get-keratone">https://www.healthsupplement24x7.com/get-keratone</a></strong></span></p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><span style="font-family: georgia;"><strong><span style="color: red;"><span style="background-color: #ffe599;">Get Huge Discount Now!!</span></span></strong></span></a></h3>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><strong><span style="font-family: georgia;"><span style="background-color: #fff2cc;"><span style="color: red;">Special Discount- As Low As On Keratone – Get Your Best Discount Online Hurry!!</span></span></span></strong></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgb67lDxHcmHkOvO0UEXiwQPbS0R6FRdMWuuaUsZ2InxQLmgD12-yjBosmBvsAfstafYJ9gfvP73kll57duryKAQbTwL00EGvy_z7o3HZnmGB8lZ9up4sf4Otq5dF0iENvt7ROcFUsZxHOgJYZpSy8WOJGNdyAffyt8Sy6f7QRoY7Jd6hG4IGa1qOeTDdGc/w640-h328/Keratone%20Toenail%20Fungus%201.jpg" alt="" width="640" height="328" border="0" data-original-height="575" data-original-width="1120" /></a></div>
<p>Your skin and nails need protection against pollutants and toxins. It is not enough to clean them with water every day. You must nourish them with natural ingredients like lavender essential oil, almond oil, tea tree oil, or aloe vera. However, making a blend and doing it on your own is difficult.</p>
<p>We have a product for you that can improve your nail and skin health and reduce the incidences of toenail fungus and fungal infections. <a href="https://sway.office.com/rJU13t7FZQh3HDyc?ref=Link&loc=mysways">Keratone</a> is a natural product that can help you fight severe nail fungus so that you don’t get brittle nails in the future.</p>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><strong><span style="font-family: georgia;"><span style="background-color: #d9d2e9;"><span style="color: red;">SALE IS LIVE</span></span></span></strong></a></h2>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><strong><span style="font-family: georgia;"><span style="background-color: #ffe599;">Get <span style="color: red;">Keratone </span> “Now Available” Hurry Limited Time Offer Only For 1st User!!</span></span></strong></a></h2>
<h2 style="text-align: left;"><strong>What Is <a href="https://www.eventcreate.com/e/keratone">Keratone</a>? What Does It Do?</strong></h2>
<p><a href="https://www.podcasts.com/keratone">Keratone</a> is a special product that uses a doctor-formulated blend to maintain healthy skin and nails in individuals. If you have been suffering from poor nail and skin health, this product can help you greatly.</p>
<p>All Keratone ingredients are derived from trusted sources to help you treat nail fungus. These natural ingredients include lemongrass, clove, lavender, aloe vera, tea tree, almond, vitamin E, organic flaxseed and other essential oils.</p>
<p>The formula of <a href="https://keratone.bandcamp.com/track/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-makeover-rediscover-beautiful-nails-spam-or-legit">Keratone</a> is rich in anti-inflammatory properties that can reduce fungal infections. This toenail fungus oil is extremely beneficial for those individuals who don’t want to take antibiotics to get rid of toenail fungus. Since it only contains natural ingredients, it promotes nail growth without causing any side effects.</p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><span style="background-color: #d9ead3;"><span style="color: red;"><span style="font-family: georgia;"><strong>LIMITED TIME OFFER</strong></span></span></span></a></h3>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>Click Here to Order Keratone at Special Discounted Price</strong></span></span></span></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUn0e7o_3sSlt4CSPNKI5sUMhCFUN_GAnVLtXPR0bERglnXftQxJvBzQGR-6zTK6_Fllkr4ianfuUm-BDzdPgc0YMmiZyQzHERQtQsoSlI2g8pI5fpRwfzE59P-Sbzi1ytKbU6ybV4RGU7PZccehLniY6v6mQ4lTt9IROOxOySemGB3g6ow2-Ub596CYay/w640-h274/Keratone%20Toenail%20Fungus%202.png" alt="" width="640" height="274" border="0" data-original-height="749" data-original-width="1750" /></a></div>
<h2 style="text-align: left;"><strong>Characteristics Of <a href="https://keratone.mystrikingly.com/">Keratone</a><br /></strong></h2>
<p>The <a href="https://keratone-report.clubeo.com/page/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-makeover-rediscover-beautiful-nails-spam-or-legit.html">Keratone</a> nail fungus eliminator is unlike anything you have ever used. This product is packed with several characteristics to provide a rewarding experience to individuals. The highlights of this nail supplement:</p>
<p>The Keratone nail health formula uses 100% natural ingredients in its formula to keep your nails and skin healthy.</p>
<p>This toenail fungus oil is a doctor-formulated blend which makes it more trustworthy.</p>
<p>Keratone targets the root cause of unhealthy nails without using GMOs, chemicals and stimulants.</p>
<p>Every bottle of <a href="https://devfolio.co/@keratone">Keratone</a> comes with an enclosed brush applicator and a cotton swab to enable you to keep your nails healthy.</p>
<p>Keratone is one of the few products that can purportedly protect nail keratin.</p>
<p>There are hundreds of Keratone reviews on the official website of the product.</p>
<p>You get a 100% money-back guarantee with every order of Keratone.</p>
<p><a href="https://www.ivoox.com/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-audios-mp3_rf_114898445_1.html">Keratone</a> is manufactured in an FDA-registered and GMP-certified facility in the US.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">MUST SEE: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/get-keratone">“Critical News Keratone Report – They Will Never Tell You This”</a></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJOmVstPi34Vjt_getBfFQIKym4Cp3i5lLExdGrEIt9NuorFGhLcWOGGSqywmTTagctbS_mCen2wHv1pHffdZaxfj3kTRdAzb1YSQ5KC9Qah-qWHDrDBKhTv_Bd2TPg9EjZD-_ScoJh9mImNh5AGWb2HBITc-BRa9u8koS3M2u1eKQdz-2UtwChOlmCJ65/w640-h240/Keratone%20Toenail%20Fungus%203.png" alt="" width="640" height="240" border="0" data-original-height="650" data-original-width="1731" /></a></div>
<h2 style="text-align: left;"><strong>How <a href="https://keratone.jimdosite.com/">Keratone</a> Works to Improve Nails and Skin?</strong></h2>
<p>Most existing formulas don't actually kill the fungus; they only stop its activity temporarily. That eliminates the symptoms but leaves the root behind in the inner layers, allowing the fungus to keep growing and surviving. It also weakens the body's natural anti-fungal defenses while the fungus builds resistance, making the attack harder to overcome. The special oils in the Keratone serum instead work in synergy to improve the body's resistance to fungus and support healthy nails and skin.</p>
<p>Keratone application not only addresses the inner fungal attack but also erodes the fungus from the body. This liquid serum absorbs well and triggers the immune cells to renew damaged cells. It thus stimulates blood circulation and offers essential nutrient support to rebuild the layers. Therefore, consistent use of this serum fixes the underlying fungal infection and clears its nasty signs and symptoms, letting users live a fungus-free life without worrying about itching or a foul smell.</p>
<p>Moreover, the <a href="https://devfolio.co/projects/keratone-toenail-fungus-fungusbusting-serum-usa-170b">Keratone</a> Serum offers ultimate benefits to incredible users. It helps users attain smooth skin and shiny nails, giving them a youthful feel no matter how old they are. You can dive into the below sections to understand more advantages of <a href="https://keratone.hashnode.dev/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-makeover-rediscover-beautiful-nailsspam-or-legit">Keratone</a> serum.</p>
<h3 style="text-align: left;"><strong><span style="font-family: georgia;"><span style="background-color: #ffe599; color: red; font-size: small;"><a href="https://www.healthsupplement24x7.com/get-keratone">Find the Original Keratone Bottles Here that have Helped Thousands of People Kicking Out Fungal Infections.</a></span></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYInjbFVjpjwx7JWafXqqLS5w-YOjrrxuxY2IhSyEMHRi6kU_sALiYPEtJcyjNNtG7GBNxr9--5h5UiFU6vfD-fdEqNN859G36GivaKHeU3JXpmKFHny7wKEsNm4mQoHY128DnhWOfDE3ugC80jzlXMR9i3-Qi0rjS1ITlLvlG9q_gejI4UGBiDvJjYljH/w640-h254/Keratone%20Toenail%20Fungus%204.png" alt="" width="640" height="254" border="0" data-original-height="660" data-original-width="1663" /></a></div>
<h2 style="text-align: left;"><strong>Ingredients Blend in <a href="https://keratone.company.site/">Keratone</a><br /></strong></h2>
<p>Before indulging in any new routine, it is better to understand its formulation and check whether the ingredients suit your body. Hence, the creator reveals the full list of Keratone ingredients on the label to ensure the transparency of the formula. This lets users research these extracts before adding them to their routine and verify whether they will have positive effects.</p>
<p>According to the <a href="https://bitbucket.org/keratone/keratone/issues/1/keratone-toenail-fungus-remover-fungus">Keratone</a> official website, the serum is made of all-natural ingredients, and there are no chemicals in it. This proprietary solution includes four high-quality oils with nine more minerals and extracts.</p>
<p><strong>Lavender oil:</strong> This oil helps in protecting nail keratin, thereby improving nail and skin health. The oil contains anti-fungal properties that fight against strong fungal attack and prevents infection.</p>
<p><strong>Organic Flaxseed oil:</strong> This particular oil contains properties to boost natural skin immunity and enhances skin health. It also helps deal with inflammation eroding the swelling caused by infection.</p>
<p><strong>Almond oil:</strong> This oil is rich in anti-fungal effects that help prevent athlete’s foot, ringworm, and more. It combats fungal infections and their nasty signs on skin and nails.</p>
<p><strong>Tea Tree oil:</strong> Tea tree oil is rich in anti-fungal properties and kills the fungi that cause ringworm. It controls the growth of fungal infections and prevents nails from diseases.</p>
<p><strong>Lemongrass oil:</strong> It is high in anti-fungal agents, which prevent infections and help deal with inflammation and damage to skin and nails.</p>
<p><strong>Aloe vera:</strong> It is a natural moisturizing agent that prevents dryness and cracks on the skin. It has anti-fungal effects that soothe skin from dehydration and signs of infections.</p>
<p><strong>Undecylenic acid:</strong> This beneficial fatty acid helps combat fungal infections. It protects the nails from brittleness and breakage and treats discoloration.</p>
<p>Although these ingredients effectively manage infections, they are also potent enough to heal and rejuvenate the skin and nails through nutrient delivery. Still, the manufacturer advises customers to consult a doctor before use to rule out other skin conditions, and to check the label for the ingredients and their effects.</p>
<h3 style="text-align: left;"><a href="https://www.healthsupplement24x7.com/get-keratone"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>To Learn More about Keratone Ingredients in Detail, Click Here to Head to Its Official Website</strong></span></span></span></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjT3xcww5iE3czsLubaDWzf-uBPPj6PWNUF7xEerds5cvZab3oULLi25V6XEv8R5HvNOXwiUo9CojNa8JESKO3qzTEL93A1vmEThDKh5Qpv900j5JzOgah6i8nJViAxtxFfK1Y08AYG8BUAOqhVAYzEEsZvPQ0VC11bUeoriAvh2G7bgiUbZLUK-ktXqHo_/w640-h334/Keratone%20Toenail%20Fungus%205.png" alt="" width="640" height="334" border="0" data-original-height="731" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>Health Benefits Of Using <a href="https://keratone.webflow.io/">Keratone</a><br /></strong></h2>
<p><a href="https://haitiliberte.com/advert/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-makeover-rediscover-beautiful-nailsspam-or-legit/">Keratone</a> is a powerful nail health supplement that promises to promote healthy skin and nails in individuals. It is mentioned on the supplement’s official website that Keratone can help you deal better with fungal infections because it is rich in antibacterial properties.</p>
<p>If you want healthy nails, this is the right supplement for you. Keratone has the following health benefits that can enhance the quality of your skin and nails.</p>
<p><strong>Treat Nail Fungus:</strong> The Keratone nail fungus eliminator uses natural ingredients rich in antifungal properties to help prevent fungal infection. This product may help you treat toenail fungus and even preserve your nail keratin to enjoy better nail health.</p>
<p>Keratone supplement hydrates the cuticles and moisturizes the area to prevent you from growing nail fungus. This product can reduce the itchiness associated with toenail fungus. It also strengthens and nourishes your nails so that you don’t get brittle nails which are an easy breeding ground for nail fungus.</p>
<p>All the ingredients in Keratone, including lavender oil and almond oil, are proven to support healthy nails.</p>
<p><strong>Boosts Skin Health:</strong> Keratone is one of the only products on the market to boost skin’s natural immunity. This product has antibacterial properties that can reduce the risk of developing foot fungus, nail fungus, or toenail fungus. It can also protect your skin against skin infections that come with fungal infections.</p>
<p>All the Keratone ingredients, like aloe vera, tea tree oil, lemongrass oil, vitamin E, lavender oil and flaxseed oil, among many others, are extremely valuable for your skin and nail health. They are rich in powerful antioxidants that can help your body deal better with a fungal infection.</p>
<p><strong>Combats Fungal Infections In Nails And Skin:</strong> Keratone supplement improves the health of your nails and skin by strengthening your body to prevent fungal infections. The ingredients of this natural product contain strong antifungal properties that empower them to treat fungal infections to some extent.</p>
<p>If you are unable to find time to take care of your nails and skin, Keratone can do the trick for you by stopping fungal growth and treating nail fungus. This product can target the root cause of nail fungus by delivering essential nutrients.</p>
<p>The ingredients found in Keratone, like lavender oil, vitamin E, tea tree essential oil and almond oil, have been used as natural remedies for improving nail health for centuries.</p>
<p><strong>Reverses Nail And Skin Aging:</strong> The Keratone nail health formula is rich in antibacterial properties that allow it to reduce the incidences of nail fungus and foot fungus to the bare minimum. Once the problem of toenail fungus is solved, you can enjoy healthy nails.</p>
<p>This product helps prevent skin aging by providing a lustrous shine with the help of natural ingredients. Keratone can minimize skin infections so you can grow healthy nails. It can protect the skin from fungal infections.</p>
<p><strong>Reduces Inflammation:</strong> According to the official website of Keratone, it can target the root cause of nail fungus and reduce the inflammation that comes with it. While treating toenail fungus, over-the-counter medications often induce side effects like excessive inflammation. But, with the powerful combination of ingredients like lavender oil and tea tree essential oil, you don’t get that.</p>
<p>This natural product has potent anti-inflammatory properties that help treat toenail fungus and promote nail health efficiently. It can preserve your nail keratin so that you stay away from developing nail fungus.</p>
<p><strong>Boosts Skin Immunity:</strong> Keratone uses a doctor-formulated blend to help you in treating nail fungus and soothe irritated skin. This toenail fungus oil works by targeting the root cause of toenail fungus that compromises the skin’s immunity.</p>
<p>Keratone is a powerful product that delivers nutrients to your nails and skin from ingredients like lavender, clove bud, organic flaxseed, tea tree, and lemongrass oils. With the help of essential vitamins and minerals, you can strengthen your skin’s immunity.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>IMPORTANT: <span style="background-color: #ffe599;"><span style="color: red;"><a href="https://www.healthsupplement24x7.com/get-keratone">Shocking Truth About Keratone – This May Change Your Mind!</a></span></span></strong></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuDklRDkWAgM3YHBoIVGdyjQGXSQqutuy2oG646lrNKejmepdR6gYigUNN4TM2eSK0R9h6xLqw1FOFXh_h61iKS6Yf3X3zlMXvwY2PPsPdcj4GlY2jw6RkC346StZwenakJop9kgKXfDmLesd32JcVST_XfQqJQYehRZASxF3GrI3Uktrzgg6_noksrUFD/w491-h368/Keratone%20Toenail%20Fungus%209.png" alt="" width="491" height="368" border="0" data-original-height="1050" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>There's Any Drawbacks Of Using <a href="https://sites.google.com/view/keratone-reviews-usa/">Keratone Toenail Fungus</a>?<br /></strong></h2>
<p>There are a few limitations involved with purchasing Keratone. Customers will not find the original bottles for sale anywhere other than its official website. For those who already have skin conditions, using it after a physician’s consultation is advisable.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Read This: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/get-keratone">"More Information From Knowledgeable Expertise of Health Labs Keratone"</a></span></strong></h3>
<h2 style="text-align: left;"><strong>How To Apply <a href="https://groups.google.com/g/keratone-toenail-fungus-reviews/c/km_W3CxL_wE">Keratone</a> On Skin And Nails?</strong></h2>
<p>You don’t have to prepare anything on your own while using Keratone. However, you need to set some time aside to apply it.</p>
<p>Every bottle of <a href="https://keratone-report.clubeo.com/calendar/2023/08/25/keratone-made-in-usa-does-it-really-work-read-real-customer-reviews-here-where-to-buy-spam-or-legit">Keratone</a> comes with an enclosed brush applicator. With the help of this brush, you can apply the oil to your skin and nails. It also contains a cotton swab that you need to use to push the oil into your nails to enhance nail health and do away with nail fungus.</p>
<p>You can also keep an emery board handy to gently file the surface of the nails for enhanced absorption of the oil. Make sure not to file the nail surface abrasively, as it can worsen the nail fungus.</p>
<p>According to the official website of Keratone, you need to use this oil four times a day to treat fungal infections and prevent toenail fungus successfully.</p>
<p>If you are not comfortable applying even the best natural remedies, you can apply lavender essential oil on your sensitive skin and expect great results.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>READ ALSO: <span style="background-color: #ffe599;"><span style="color: red;"><a href="https://www.healthsupplement24x7.com/get-keratone">Does the Keratone Work For Everyone? Before you buy, read real customer reviews and testimonials!!</a></span></span></strong></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTtLXC9xp7EseSw5xJOwZEg_NM7iNq3BO3UQcima8e9JdbByBTjgZQvzqb2q0rEnUadXtZW7BcVxC2P3H2vMDPrx8JVEGo2vhEAfH7DRjD2vs2Rcu34qFewCU1_4UtiYz4yXs7VWmJcO5AiIhZXW5GR7bpDNbX5PzSrGJB_qOfR0d2Cpgn7J1LpW6n3PJ0/w640-h288/Keratone%20Toenail%20Fungus%206.jpg" alt="" width="640" height="288" border="0" data-original-height="717" data-original-width="1588" /></a></div>
<h2 style="text-align: left;"><strong>Where to Order the Genuine <a href="https://lookerstudio.google.com/reporting/314a650b-3dca-42f1-8a60-95acb8eef5a2">Keratone</a> Product? It's Pricing!</strong></h2>
<p>Obviously, the original <a href="https://k2tropfen.clubeo.com/calendar/2023/08/25/keratone-scientific-secret-nursing-formula-to-get-healthy-skin-nails-and-fungus-free-nails-work-or-hoax">Keratone</a> serum is not available in stores or on other online sites. It is available only on the OFFICIAL WEBSITE. That is the best way to get legitimate Keratone bottles without any scam. Buying there ensures that you receive the genuine product directly from the manufacturer, along with deals and discounts offered for a limited time that are not found anywhere else with scammers or unauthorized sellers.</p>
<p><strong>These are the <a href="https://www.yepdesk.com/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-makeover-rediscover-beautiful-nails">Keratone</a> costs which decline while getting more units simultaneously:</strong></p>
<p><strong>Basic -</strong> 1 Bottle Supply of Keratone USD 69/bottle + SMALL SHIPPING.</p>
<p><strong>Popular Pack -</strong> Buy 3 Bottle Supply of <a href="https://filmfreeway.com/KeratoneToenailFungusRemoverFungus-FreeFeetYourNailMakeoverRediscoverBeautifulNa">Keratone</a> USD 59/bottle + SMALL SHIPPING + 2 FREE BONUSES.</p>
<p><strong>Best Value Pack - </strong>Buy 6 Bottle Supply of Keratone USD 59/bottle + FREE SHIPPING + 2 FREE BONUSES.</p>
<p style="text-align: left;">Keratone Payments are made using 256-bit SSL technology to keep information safe and secure, and all orders arrive within a few business days of ordering.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Special Offer: <span style="background-color: #fff2cc; color: red;"><a href="https://www.healthsupplement24x7.com/get-keratone">Click Here To Get Heavy Discount Instantly!!</a></span></strong></h3>
<p><span style="font-family: times;"><span style="font-size: medium;"><span style="color: red;">Good News: Get additional discount on shipping when you checkout with Mastercard or Discover card!</span></span></span></p>
<div class="separator" style="clear: both; text-align: center;">
<p style="text-align: left;"><span style="font-size: medium;"><a style="clear: left; float: left; margin-bottom: 1em; margin-left: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/a/AVvXsEgJqDXBj2s2sKgxhjLGKnDNPxD392fUjUkF8lQbqbuoFZwPHnPE27muXA18Hs1EzbsUHHsPlOR9Njx119fwMPFiCrLv9NlRRfEUdLPeIVlqZmqjexv1dJ0pMoSO6VUtSY89rewM_LiPyGpkGpNCHHdprDSvrWyt6MprtcceNFal6bdDPK_FyvLHnQzy-A" alt="" width="110" height="120" border="0" data-original-height="120" data-original-width="110" /></a><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">APPROVED!</span><br /></strong></span></span></span></p>
<p style="text-align: left;"><span style="font-family: helvetica;"><span style="font-size: small;">Limited supply available. We currently have product in stock and ready to ship within <span style="color: red;">24 hours</span>.</span></span></p>
</div>
<p><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">EXPIRE SOON</span></strong></span></span></p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsWPctkTs0fMM5t7QfuyiqQAAMUYtayIO7iq1-WoolXjhmdswIPx8KuzYXDlDlmvvYiqWkFOIPhDnGMgUZqhVU3Mq3PJU5mKDGAlcot5GUn_-X58R6-U4LZFVSYj8PCM62EGokkj2nk12XMh-p4zCmb39_xociSBq3B5DdZoKtCZXqqR1FtbO-vo3nyyZA/w458-h458/Keratone%20Toenail%20Fungus%208.png" alt="" width="458" height="458" border="0" data-original-height="1400" data-original-width="1400" /></a></div>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSJvORHtAeEI3H2rypjo7v70Cm2j2tC1B-Ja0K1qVp1MEYhmISktm3oeSPvmtOjcgIp6VWYex2WQ2w6gsXFZPdis4AmxfwRGftHtwSK5PNs5-vJjhVZwsNY6SljpUWbanRSWMbVUibr78lOgAkjowIEGQGH8g4my7mrAF8bND5KSQ7K8qU9d1qadr8WA/w327-h97/btn.png" alt="" width="327" height="97" border="0" data-original-height="84" data-original-width="282" /></a></div>
<p style="text-align: center;">By submitting, you affirm to have read and agreed to our <a href="https://www.healthsupplement24x7.com/get-keratone"><span style="color: red;">Terms & Conditions</span></a>.</p>
<p style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><span style="font-size: medium;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>HUGE SAVINGS Get Your Keratone “Get Something OFF” Get 2+1 Offer Hurry Only For 1st User!!</strong></span></span></span></span></a></p>
<h2 style="text-align: left;"><strong><a href="https://colab.research.google.com/drive/1WdGRGBNzQZDY390_oNad-TzrDZfwt19v?usp=sharing">Keratone</a> Reviews – Final Verdict</strong></h2>
<p><a href="https://soundcloud.com/keratone/keratone-toenail-fungus-remover-fungus-free-feet-your-nail-makeover-rediscover-beautiful-nails">Keratone</a> is an excellent product with antifungal properties to eliminate nail fungus. Not only does it treat toenail fungus, but it also prevents you from getting further fungal infections. All the Keratone ingredients are scientifically proven to help treat fungal infections and promote healthy nails.</p>
<p class="ql-align-center" style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-keratone"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHYxn3NMPQqsmKG54OjkQESM8dw8D7zUXtssdLHaaWSYArzmNucZfEfKCOBsnUqZdp6i-enO0zDWtMGF2pKG2MifoTldIDExJOBDWxicPkSeox29VCmqX6Cz2feNaSfYBnC_BHUdfPT1qUGVgSNyn0NtyKxY-V-M-BDbo5jCOW4qSuxwu3TOTA3dSjIQ/s1600/Screenshot%20(1445).png" alt="" width="320" height="114" /></a></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><a href="https://www.healthsupplement24x7.com/get-keratone"><strong>Terms and Conditions</strong></a><strong> | </strong><a href="https://www.healthsupplement24x7.com/get-keratone"><strong>Privacy</strong></a><strong> | </strong><a href="https://www.healthsupplement24x7.com/get-keratone"><strong>Contact Us</strong></a></span></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><strong>© 2023 <a href="https://www.healthsupplement24x7.com/get-keratone">Keratone</a></strong><strong>. All Rights Reserved.</strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong>Read More;</strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://www.dibiz.com/keratonereport">https://www.dibiz.com/keratonereport</a></strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://keratone.cgsociety.org/y8u6/keratone-toenail-fun">https://keratone.cgsociety.org/y8u6/keratone-toenail-fun</a></strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://keratone.contently.com/?public_only=true">https://keratone.contently.com/?public_only=true</a></strong></span></p> |
Addkrosb/Ai_bot | 2023-08-26T09:58:47.000Z | [
"license:openrail",
"region:us"
] | Addkrosb | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.1-l2-7b | 2023-08-27T12:43:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zarafusionex-1.1-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zarafusionex-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionex-1.1-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.1-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T09:58:58.682404](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.1-l2-7b/blob/main/results_2023-08-26T09%3A58%3A58.682404.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5226913662884187,\n\
\ \"acc_stderr\": 0.03498320172310386,\n \"acc_norm\": 0.5262658023755431,\n\
\ \"acc_norm_stderr\": 0.03496787551566934,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5066113763426128,\n\
\ \"mc2_stderr\": 0.015349039503319952\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636586,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212865\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6046604262099183,\n\
\ \"acc_stderr\": 0.004879242848473458,\n \"acc_norm\": 0.7933678550089623,\n\
\ \"acc_norm_stderr\": 0.004040617668261035\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n\
\ \"acc_stderr\": 0.02800913812540039,\n \"acc_norm\": 0.5870967741935483,\n\
\ \"acc_norm_stderr\": 0.02800913812540039\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390989,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390989\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.032478490123081544,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.032478490123081544\n },\n \"harness|hendrycksTest-high_school_physics|5\"\
: {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n\
\ \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115072,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115072\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7126436781609196,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.7126436781609196,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.028384256704883037,\n\
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.028384256704883037\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.02766713856942271,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.02766713856942271\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543454,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543454\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36962190352020863,\n\
\ \"acc_stderr\": 0.012328445778575253,\n \"acc_norm\": 0.36962190352020863,\n\
\ \"acc_norm_stderr\": 0.012328445778575253\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5212418300653595,\n \"acc_stderr\": 0.02020957238860025,\n \
\ \"acc_norm\": 0.5212418300653595,\n \"acc_norm_stderr\": 0.02020957238860025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458619,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458619\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5066113763426128,\n\
\ \"mc2_stderr\": 0.015349039503319952\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zarafusionex-1.1-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|arc:challenge|25_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hellaswag|10_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:58:58.682404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:58:58.682404.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T09:58:58.682404.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T09:58:58.682404.parquet'
- config_name: results
data_files:
- split: 2023_08_26T09_58_58.682404
path:
- results_2023-08-26T09:58:58.682404.parquet
- split: latest
path:
- results_2023-08-26T09:58:58.682404.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.1-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zarafusionex-1.1-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zarafusionex-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionex-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.1-l2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
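Each per-task config above maps to a parquet path following one fixed pattern; as a sketch, a small helper (hypothetical, for illustration only) shows how a task name, few-shot count, and run timestamp combine into the glob listed in the metadata:

```python
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Rebuild the parquet glob used by the per-task configs above (illustrative helper)."""
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"

# Matches the entry under harness_hendrycksTest_world_religions_5:
print(details_glob("hendrycksTest-world_religions", 5, "2023-08-26T09:58:58.682404"))
# → **/details_harness|hendrycksTest-world_religions|5_2023-08-26T09:58:58.682404.parquet
```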
## Latest results
These are the [latest results from run 2023-08-26T09:58:58.682404](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.1-l2-7b/blob/main/results_2023-08-26T09%3A58%3A58.682404.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5226913662884187,
"acc_stderr": 0.03498320172310386,
"acc_norm": 0.5262658023755431,
"acc_norm_stderr": 0.03496787551566934,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5066113763426128,
"mc2_stderr": 0.015349039503319952
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636586,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212865
},
"harness|hellaswag|10": {
"acc": 0.6046604262099183,
"acc_stderr": 0.004879242848473458,
"acc_norm": 0.7933678550089623,
"acc_norm_stderr": 0.004040617668261035
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.02800913812540039,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.02800913812540039
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.03394853965156402,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.03394853965156402
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.03074890536390989,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.03074890536390989
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.032478490123081544,
"acc_norm": 0.5,
"acc_norm_stderr": 0.032478490123081544
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115072,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115072
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04712821257426769,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04712821257426769
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7126436781609196,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.7126436781609196,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258875,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.028384256704883037,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.028384256704883037
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.02766713856942271,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.02766713856942271
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543454,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543454
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36962190352020863,
"acc_stderr": 0.012328445778575253,
"acc_norm": 0.36962190352020863,
"acc_norm_stderr": 0.012328445778575253
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5212418300653595,
"acc_stderr": 0.02020957238860025,
"acc_norm": 0.5212418300653595,
"acc_norm_stderr": 0.02020957238860025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458619,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458619
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5066113763426128,
"mc2_stderr": 0.015349039503319952
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xuqinyang/BaiduBaike-5.63M | 2023-08-26T11:36:39.000Z | [
"region:us"
] | xuqinyang | null | null | null | 0 | 0 | Entry not found |
goind/mnj | 2023-08-26T10:08:44.000Z | [
"region:us"
] | goind | null | null | null | 0 | 0 | Entry not found |
Toflamus/alpaca_data_raw | 2023-08-26T10:26:33.000Z | [
"region:us"
] | Toflamus | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 19000112
num_examples: 52002
download_size: 11986671
dataset_size: 19000112
---
# Dataset Card for "alpaca_data_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_kajdun__iubaris-13b-v3 | 2023-08-27T12:43:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of kajdun/iubaris-13b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kajdun/iubaris-13b-v3](https://huggingface.co/kajdun/iubaris-13b-v3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kajdun__iubaris-13b-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T10:44:57.625308](https://huggingface.co/datasets/open-llm-leaderboard/details_kajdun__iubaris-13b-v3/blob/main/results_2023-08-26T10%3A44%3A57.625308.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5457651905991454,\n\
\ \"acc_stderr\": 0.03462957237621476,\n \"acc_norm\": 0.5496206901026076,\n\
\ \"acc_norm_stderr\": 0.03461040607243729,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4860621835708466,\n\
\ \"mc2_stderr\": 0.015429990225329837\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670456,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345427\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.625273849830711,\n\
\ \"acc_stderr\": 0.004830628620181031,\n \"acc_norm\": 0.8177653853813981,\n\
\ \"acc_norm_stderr\": 0.003852488177553968\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330877,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330877\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155196,\n \"\
acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155196\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n\
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.037466683254700206,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.037466683254700206\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7318007662835249,\n\
\ \"acc_stderr\": 0.015842430835269424,\n \"acc_norm\": 0.7318007662835249,\n\
\ \"acc_norm_stderr\": 0.015842430835269424\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194625,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194625\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136215,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136215\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969758,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.0301164262965406,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.0301164262965406\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4860621835708466,\n\
\ \"mc2_stderr\": 0.015429990225329837\n }\n}\n```"
repo_url: https://huggingface.co/kajdun/iubaris-13b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|arc:challenge|25_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hellaswag|10_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T10:44:57.625308.parquet'
- config_name: results
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- results_2023-08-26T10:44:57.625308.parquet
- split: latest
path:
- results_2023-08-26T10:44:57.625308.parquet
---
# Dataset Card for Evaluation run of kajdun/iubaris-13b-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kajdun/iubaris-13b-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kajdun/iubaris-13b-v3](https://huggingface.co/kajdun/iubaris-13b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kajdun__iubaris-13b-v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
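The timestamped split names visible in this card's config (e.g. `2023_08_26T10_44_57.625308`) appear to be the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping — an observation inferred from the split names above, not an official API:

```python
# Derive the split name used in this card's config from a run timestamp.
# The substitution rule is inferred from the split names above, not from a spec.
run_timestamp = "2023-08-26T10:44:57.625308"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_08_26T10_44_57.625308
```

Passing `split_name` (or simply `"latest"`) as the `split` argument of `load_dataset` then selects that run.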
## Latest results
These are the [latest results from run 2023-08-26T10:44:57.625308](https://huggingface.co/datasets/open-llm-leaderboard/details_kajdun__iubaris-13b-v3/blob/main/results_2023-08-26T10%3A44%3A57.625308.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5457651905991454,
"acc_stderr": 0.03462957237621476,
"acc_norm": 0.5496206901026076,
"acc_norm_stderr": 0.03461040607243729,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4860621835708466,
"mc2_stderr": 0.015429990225329837
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670456,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345427
},
"harness|hellaswag|10": {
"acc": 0.625273849830711,
"acc_stderr": 0.004830628620181031,
"acc_norm": 0.8177653853813981,
"acc_norm_stderr": 0.003852488177553968
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.019149093743155196,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.019149093743155196
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7318007662835249,
"acc_stderr": 0.015842430835269424,
"acc_norm": 0.7318007662835249,
"acc_norm_stderr": 0.015842430835269424
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194625,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194625
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136215,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136215
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969758,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.0301164262965406,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.0301164262965406
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4860621835708466,
"mc2_stderr": 0.015429990225329837
}
}
```
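Once loaded, the per-task entries above are a plain Python mapping, so they can be sliced with the standard library. A minimal sketch that ranks a hand-copied subset of the accuracies shown above (in practice the full mapping would come from the loaded results, not a hand-typed dict):

```python
# A few of the per-task accuracies reported above, copied by hand for illustration.
scores = {
    "harness|hendrycksTest-marketing|5": 0.8076923076923077,
    "harness|hendrycksTest-high_school_government_and_politics|5": 0.7875647668393783,
    "harness|hendrycksTest-moral_scenarios|5": 0.2737430167597765,
    "harness|hendrycksTest-econometrics|5": 0.2631578947368421,
}

# Rank tasks from strongest to weakest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```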
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
uoplemnaso/RedBoost | 2023-08-26T10:50:53.000Z | [
"license:openrail",
"region:us"
] | uoplemnaso | null | null | null | 0 | 0 | ---
license: openrail
---
|
longevity-genie/bge_large_512_aging_papers_paragraphs | 2023-08-26T11:03:04.000Z | [
"license:openrail",
"region:us"
] | longevity-genie | null | null | null | 0 | 0 | ---
license: openrail
---
|
FinchResearch/CodeSet-SC | 2023-08-29T00:48:51.000Z | [
"region:us"
] | FinchResearch | null | null | null | 0 | 0 | Entry not found |
Plona/Test | 2023-10-03T10:53:14.000Z | [
"license:unknown",
"region:us"
] | Plona | null | null | null | 0 | 0 | ---
license: unknown
---
|
kodinD/White_bg | 2023-08-26T12:14:34.000Z | [
"region:us"
] | kodinD | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 5457078543.84
num_examples: 256240
download_size: 6909179986
dataset_size: 5457078543.84
---
## White background dataset
Hello! This is a dataset of images, mainly on a white background, with text descriptions. |
open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v0 | 2023-09-16T17:05:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of HyperbeeAI/Tulpar-7b-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HyperbeeAI/Tulpar-7b-v0](https://huggingface.co/HyperbeeAI/Tulpar-7b-v0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"latest\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-09-16T17:05:33.641696](https://huggingface.co/datasets/open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v0/blob/main/results_2023-09-16T17-05-33.641696.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3200503355704698,\n\
\ \"em_stderr\": 0.004777351284269766,\n \"f1\": 0.39745910234899495,\n\
\ \"f1_stderr\": 0.004660867839676267,\n \"acc\": 0.38302318192072277,\n\
\ \"acc_stderr\": 0.00841750512181253\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3200503355704698,\n \"em_stderr\": 0.004777351284269766,\n\
\ \"f1\": 0.39745910234899495,\n \"f1_stderr\": 0.004660867839676267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.027293404094010616,\n \
\ \"acc_stderr\": 0.004488095380209751\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.012346914863415308\n\
\ }\n}\n```"
repo_url: https://huggingface.co/HyperbeeAI/Tulpar-7b-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|arc:challenge|25_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T17_05_33.641696
path:
- '**/details_harness|drop|3_2023-09-16T17-05-33.641696.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T17-05-33.641696.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T17_05_33.641696
path:
- '**/details_harness|gsm8k|5_2023-09-16T17-05-33.641696.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T17-05-33.641696.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hellaswag|10_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:16:04.808575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T12:16:04.808575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T12:16:04.808575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T17_05_33.641696
path:
- '**/details_harness|winogrande|5_2023-09-16T17-05-33.641696.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T17-05-33.641696.parquet'
- config_name: results
data_files:
- split: 2023_08_26T12_16_04.808575
path:
- results_2023-08-26T12:16:04.808575.parquet
- split: 2023_09_16T17_05_33.641696
path:
- results_2023-09-16T17-05-33.641696.parquet
- split: latest
path:
- results_2023-09-16T17-05-33.641696.parquet
---
# Dataset Card for Evaluation run of HyperbeeAI/Tulpar-7b-v0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HyperbeeAI/Tulpar-7b-v0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HyperbeeAI/Tulpar-7b-v0](https://huggingface.co/HyperbeeAI/Tulpar-7b-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v0",
	"harness_winogrande_5",
	split="latest")
```
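The timestamped split names in the configurations above are derived from the run timestamps embedded in the parquet file names: `-` and `:` become `_`, while the fractional-seconds dot is kept. A small helper, as a sketch based on the naming pattern visible in the YAML above, to map one form to the other:

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp as it appears in the parquet file names
    (e.g. "2023-08-26T12:16:04.808575") to the split name used by the
    dataset configurations (e.g. "2023_08_26T12_16_04.808575")."""
    return timestamp.replace("-", "_").replace(":", "_")

# Both timestamp spellings seen in the file names above resolve to
# their corresponding split names:
print(timestamp_to_split_name("2023-08-26T12:16:04.808575"))
# 2023_08_26T12_16_04.808575
print(timestamp_to_split_name("2023-09-16T17-05-33.641696"))
# 2023_09_16T17_05_33.641696
```

Passing such a split name to `load_dataset` selects the details of that specific run instead of the `latest` alias.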
## Latest results
These are the [latest results from run 2023-09-16T17:05:33.641696](https://huggingface.co/datasets/open-llm-leaderboard/details_HyperbeeAI__Tulpar-7b-v0/blob/main/results_2023-09-16T17-05-33.641696.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3200503355704698,
"em_stderr": 0.004777351284269766,
"f1": 0.39745910234899495,
"f1_stderr": 0.004660867839676267,
"acc": 0.38302318192072277,
"acc_stderr": 0.00841750512181253
},
"harness|drop|3": {
"em": 0.3200503355704698,
"em_stderr": 0.004777351284269766,
"f1": 0.39745910234899495,
"f1_stderr": 0.004660867839676267
},
"harness|gsm8k|5": {
"acc": 0.027293404094010616,
"acc_stderr": 0.004488095380209751
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.012346914863415308
}
}
```
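A results payload like the one above nests per-task metrics under task keys, alongside an "all" aggregate. A minimal sketch (reusing the numbers reported above, with the stderr fields omitted for brevity) of collecting per-task accuracy from such a dict:

```python
# Per-task metrics copied from the results JSON above (stderr omitted).
results = {
    "all": {"em": 0.3200503355704698, "f1": 0.39745910234899495,
            "acc": 0.38302318192072277},
    "harness|drop|3": {"em": 0.3200503355704698, "f1": 0.39745910234899495},
    "harness|gsm8k|5": {"acc": 0.027293404094010616},
    "harness|winogrande|5": {"acc": 0.7387529597474349},
}

# Collect accuracy per task, skipping the "all" aggregate and tasks
# (like DROP) that report em/f1 instead of accuracy.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(per_task_acc)
# {'harness|gsm8k|5': 0.027293404094010616, 'harness|winogrande|5': 0.7387529597474349}
```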
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_electricity_gosdt_l512_d3_sd1 | 2023-08-26T12:19:14.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 1560840336
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_electricity_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_electricity_gosdt_l512_d3_sd2 | 2023-08-26T12:21:35.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 1560789998
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_electricity_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_electricity_gosdt_l512_d3_sd3 | 2023-08-26T12:27:28.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 1561506514
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_electricity_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Isaak-Carter/Private_v1 | 2023-08-26T12:28:03.000Z | [
"region:us"
] | Isaak-Carter | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq | 2023-08-27T12:43:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of malhajar/Platypus2-70B-instruct-4bit-gptq
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [malhajar/Platypus2-70B-instruct-4bit-gptq](https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T12:30:11.519673](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq/blob/main/results_2023-08-26T12%3A30%3A11.519673.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23568332946534118,\n\
\ \"acc_stderr\": 0.030875990616634128,\n \"acc_norm\": 0.23665349264658486,\n\
\ \"acc_norm_stderr\": 0.030890666475037305,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.4955854635237609,\n\
\ \"mc2_stderr\": 0.01695340721579618\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2363481228668942,\n \"acc_stderr\": 0.012414960524301829,\n\
\ \"acc_norm\": 0.2901023890784983,\n \"acc_norm_stderr\": 0.01326157367752077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2560246962756423,\n\
\ \"acc_stderr\": 0.004355436696716298,\n \"acc_norm\": 0.25951005775741887,\n\
\ \"acc_norm_stderr\": 0.0043746991892848605\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.033550453048829226,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.033550453048829226\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.19032258064516128,\n \"acc_stderr\": 0.022331707611823088,\n \"\
acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.022331707611823088\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"\
acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n\
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.032578473844367774,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.032578473844367774\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217878,\n \"\
acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217878\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.12037037037037036,\n \"acc_stderr\": 0.02219169594400172,\n \"\
acc_norm\": 0.12037037037037036,\n \"acc_norm_stderr\": 0.02219169594400172\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n\
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398691,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398691\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859923,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934101,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934101\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.20921985815602837,\n \"acc_stderr\": 0.02426476943998847,\n \
\ \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.02426476943998847\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.022770868010112997,\n\
\ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.022770868010112997\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724136,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724136\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813292,\n\
\ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813292\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.4955854635237609,\n\
\ \"mc2_stderr\": 0.01695340721579618\n }\n}\n```"
repo_url: https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|arc:challenge|25_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hellaswag|10_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T12:30:11.519673.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T12:30:11.519673.parquet'
- config_name: results
data_files:
- split: 2023_08_26T12_30_11.519673
path:
- results_2023-08-26T12:30:11.519673.parquet
- split: latest
path:
- results_2023-08-26T12:30:11.519673.parquet
---
# Dataset Card for Evaluation run of malhajar/Platypus2-70B-instruct-4bit-gptq
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [malhajar/Platypus2-70B-instruct-4bit-gptq](https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T12:30:11.519673](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq/blob/main/results_2023-08-26T12%3A30%3A11.519673.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23568332946534118,
"acc_stderr": 0.030875990616634128,
"acc_norm": 0.23665349264658486,
"acc_norm_stderr": 0.030890666475037305,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662574,
"mc2": 0.4955854635237609,
"mc2_stderr": 0.01695340721579618
},
"harness|arc:challenge|25": {
"acc": 0.2363481228668942,
"acc_stderr": 0.012414960524301829,
"acc_norm": 0.2901023890784983,
"acc_norm_stderr": 0.01326157367752077
},
"harness|hellaswag|10": {
"acc": 0.2560246962756423,
"acc_stderr": 0.004355436696716298,
"acc_norm": 0.25951005775741887,
"acc_norm_stderr": 0.0043746991892848605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.033550453048829226,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.033550453048829226
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220575,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220575
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823088,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823088
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421255,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421255
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.2,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.032578473844367774,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.032578473844367774
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217878,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.12037037037037036,
"acc_stderr": 0.02219169594400172,
"acc_norm": 0.12037037037037036,
"acc_norm_stderr": 0.02219169594400172
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398691,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398691
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859923,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934101,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934101
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20921985815602837,
"acc_stderr": 0.02426476943998847,
"acc_norm": 0.20921985815602837,
"acc_norm_stderr": 0.02426476943998847
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.022770868010112997,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.022770868010112997
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813292,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813292
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662574,
"mc2": 0.4955854635237609,
"mc2_stderr": 0.01695340721579618
}
}
```
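Each per-task entry above follows the same `harness|<task>|<n_fewshot>` key pattern, so the nested results can be flattened into a simple table for comparison. A minimal, self-contained sketch (using only a small excerpt of the dict above; the variable names are illustrative):

```python
# Illustrative only: a small excerpt of the results dict shown above.
results = {
    "harness|arc:challenge|25": {"acc": 0.2363481228668942, "acc_norm": 0.2901023890784983},
    "harness|hellaswag|10": {"acc": 0.2560246962756423, "acc_norm": 0.25951005775741887},
    "harness|truthfulqa:mc|0": {"mc1": 0.2460220318237454, "mc2": 0.4955854635237609},
}

# Task keys follow the pattern "harness|<task>|<n_fewshot>";
# flatten them into (task, n_fewshot, metric, value) rows.
rows = []
for key, metrics in results.items():
    _, task, fewshot = key.split("|")
    for metric, value in metrics.items():
        rows.append((task, int(fewshot), metric, value))

for task, fewshot, metric, value in rows:
    print(f"{task:15s} {fewshot:2d}-shot {metric:8s} {value:.4f}")
```

The same loop works over the full dict, since every per-task block uses the same key layout.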
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FASOXO/Saturn_sd | 2023-08-27T12:11:07.000Z | [
"region:us"
] | FASOXO | null | null | null | 0 | 0 | Entry not found |
lv2/Indonesia_LLama | 2023-08-26T23:08:53.000Z | [
"region:us"
] | lv2 | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42278540
num_examples: 49969
download_size: 22157927
dataset_size: 42278540
---
# Dataset Card for "Indonesia_LLama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coralexbadea/monitorul_trial_qa1 | 2023-08-26T12:53:49.000Z | [
"region:us"
] | coralexbadea | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 738194
num_examples: 2570
download_size: 344199
dataset_size: 738194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "monitorul_trial_qa1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Onno/hotel-images-v2 | 2023-08-26T12:59:02.000Z | [
"region:us"
] | Onno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Negative
'1': Positive
splits:
- name: train
num_bytes: 110056190.0
num_examples: 419
download_size: 110061896
dataset_size: 110056190.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hotel-images-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jmaczan/rick-and-morty-scripts-vicuna-1 | 2023-08-26T17:51:49.000Z | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:other",
"cartoon",
"region:us"
] | jmaczan | null | null | null | 0 | 0 | ---
license: other
task_categories:
- text-generation
language:
- en
tags:
- cartoon
pretty_name: Rick and Morty Scripts for Vicuna 1
size_categories:
- 1K<n<10K
---
## Rick and Morty scripts in Vicuna 1 format
License as in https://www.kaggle.com/datasets/andradaolteanu/rickmorty-scripts
---
Original dataset by [Andrada](https://www.kaggle.com/andradaolteanu), adjusted to the Vicuna 1 format by [Jędrzej Paweł Maczan](https://maczan.pl) for the C-137 project - [Llama 2 7B on Apple M2, fine-tuned to revive Rick](https://github.com/jmaczan/c-137) |
KeiSumi/SampleTest | 2023-08-26T15:08:44.000Z | [
"region:us"
] | KeiSumi | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
coralexbadea/monitorul_trial_qa300 | 2023-08-26T13:19:31.000Z | [
"region:us"
] | coralexbadea | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 619532
num_examples: 2094
download_size: 291058
dataset_size: 619532
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "monitorul_trial_qa300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UTibetNLP/tibetan_news_classification | 2023-08-26T14:02:08.000Z | [
"language:bo",
"region:us"
] | UTibetNLP | null | null | null | 0 | 0 | ---
language:
- bo
---
# Tibetan News Classification Corpus
**This is the open-sourced training corpus of our [Tibetan BERT Model](https://huggingface.co/UTibetNLP/tibetan_bert).**
## Citation
Please cite our [paper](https://dl.acm.org/doi/10.1145/3548608.3559255) if you use this training corpus or the model:
```
@inproceedings{10.1145/3548608.3559255,
author = {Zhang, Jiangyan and Kazhuo, Deji and Gadeng, Luosang and Trashi, Nyima and Qun, Nuo},
title = {Research and Application of Tibetan Pre-Training Language Model Based on BERT},
year = {2022},
isbn = {9781450397179},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3548608.3559255},
doi = {10.1145/3548608.3559255},
abstract = {In recent years, pre-training language models have been widely used in the field of natural language processing, but the research on Tibetan pre-training language models is still in the exploratory stage. To promote the further development of Tibetan natural language processing and effectively solve the problem of the scarcity of Tibetan annotation data sets, the article studies the Tibetan pre-training language model based on BERT. First, given the characteristics of the Tibetan language, we constructed a data set for the BERT pre-training language model and downstream text classification tasks. Secondly, construct a small-scale Tibetan BERT pre-training language model to train it. Finally, the performance of the model was verified through the downstream task of Tibetan text classification, and an accuracy rate of 86\% was achieved on the task of text classification. Experiments show that the model we built has a significant effect on the task of Tibetan text classification.},
booktitle = {Proceedings of the 2022 2nd International Conference on Control and Intelligent Robotics},
pages = {519–524},
numpages = {6},
location = {Nanjing, China},
series = {ICCIR '22}
}
``` |
junghoonson/openpayments-cms | 2023-08-26T13:53:34.000Z | [
"license:unknown",
"region:us"
] | junghoonson | null | null | null | 0 | 0 | ---
license: unknown
---
|
reichenbach/news_classification_kaggle_dt | 2023-08-26T13:54:28.000Z | [
"region:us"
] | reichenbach | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: link
dtype: string
- name: headline
dtype: string
- name: category
dtype: string
- name: short_description
dtype: string
- name: authors
dtype: string
- name: date
dtype: timestamp[s]
splits:
- name: train
num_bytes: 56378761.39201153
num_examples: 167621
- name: test
num_bytes: 14094942.60798847
num_examples: 41906
download_size: 44996856
dataset_size: 70473704.0
---
# Dataset Card for "news_classification_kaggle_dt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
focia/yt_main_image_dataset | 2023-08-26T15:10:53.000Z | [
"region:us"
] | focia | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: videoId
dtype: string
- name: imagePath
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 16042609970.48
num_examples: 114680
download_size: 949694879
dataset_size: 16042609970.48
---
# Dataset Card for "yt_main_image_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SpyderG7/Fousey | 2023-08-26T14:20:52.000Z | [
"region:us"
] | SpyderG7 | null | null | null | 0 | 0 | Entry not found |
Asor/guanaco-llama2-200 | 2023-08-26T14:23:50.000Z | [
"license:mit",
"region:us"
] | Asor | null | null | null | 0 | 0 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 338808
num_examples: 200
download_size: 201257
dataset_size: 338808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Toflamus/alpaca_data_split | 2023-08-26T14:25:07.000Z | [
"region:us"
] | Toflamus | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 17099808.501826853
num_examples: 46801
- name: test
num_bytes: 1900303.4981731472
num_examples: 5201
download_size: 12068449
dataset_size: 19000112.0
---
# Dataset Card for "alpaca_data_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChanHE/course_rate | 2023-08-26T14:51:27.000Z | [
"region:us"
] | ChanHE | null | null | null | 0 | 0 | |
mHossain/merge_new_para_detection_data_v7 | 2023-08-26T15:02:30.000Z | [
"region:us"
] | mHossain | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 14605326.9
num_examples: 86400
- name: test
num_bytes: 1622814.1
num_examples: 9600
download_size: 7336900
dataset_size: 16228141.0
---
# Dataset Card for "merge_new_para_detection_data_v7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
focia/yt_full_image_dataset | 2023-08-26T15:45:24.000Z | [
"region:us"
] | focia | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: channelId
dtype: string
- name: videoId
dtype: string
- name: title
dtype: string
- name: description
dtype: string
- name: views
dtype: int64
- name: url
dtype: string
- name: publishDate
dtype: timestamp[ns]
- name: lengthSeconds
dtype: int64
- name: subscriberCount
dtype: int64
- name: videoCount
dtype: int64
- name: isVerified
dtype: bool
- name: keywords
dtype: string
- name: country
dtype: string
- name: imagePath
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 16107504583.48
num_examples: 114680
download_size: 950988308
dataset_size: 16107504583.48
---
# Dataset Card for "yt_full_image_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RikoteMaster/llama2_classifying_and_explainning | 2023-08-26T15:15:22.000Z | [
"region:us"
] | RikoteMaster | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: Explanation
dtype: string
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 51981712
num_examples: 47512
download_size: 16818458
dataset_size: 51981712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2_classifying_and_explainning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RikoteMaster/llama2_classifying_and_explainning_v2 | 2023-08-26T15:16:45.000Z | [
"region:us"
] | RikoteMaster | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Explanation
dtype: string
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 51981712
num_examples: 47512
download_size: 16818458
dataset_size: 51981712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2_classifying_and_explainning_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/based_1 | 2023-08-26T15:19:05.000Z | [
"region:us"
] | TinyPixel | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: human
dtype: string
- name: bot
dtype: string
splits:
- name: train
num_bytes: 50290
num_examples: 176
download_size: 36285
dataset_size: 50290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "based_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
my-demo-org/embedded_faqs_medicare | 2023-08-26T15:27:55.000Z | [
"region:us"
] | my-demo-org | null | null | null | 0 | 0 | Entry not found |
sohaibkhan1192/test | 2023-08-26T15:39:58.000Z | [
"region:us"
] | sohaibkhan1192 | null | null | null | 0 | 0 | Entry not found |
igorvanw/monthly | 2023-08-26T15:46:11.000Z | [
"license:afl-3.0",
"region:us"
] | igorvanw | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
SameerMahajan/marathi_numbers-1-100 | 2023-08-26T16:05:37.000Z | [
"region:us"
] | SameerMahajan | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 214224810.8
num_examples: 2730
download_size: 16138632
dataset_size: 214224810.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marathi_numbers-1-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indiejoseph/wikitext-zh-yue | 2023-08-26T16:37:07.000Z | [
"license:cc-by-3.0",
"region:us"
] | indiejoseph | null | null | null | 0 | 0 | ---
license: cc-by-3.0
---
|
mHossain/merge_new_para_detection_data_v8 | 2023-08-26T16:34:13.000Z | [
"region:us"
] | mHossain | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 12768876.9
num_examples: 75600
- name: test
num_bytes: 1418764.1
num_examples: 8400
download_size: 6418901
dataset_size: 14187641.0
---
# Dataset Card for "merge_new_para_detection_data_v8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Augusto777/OCT2017 | 2023-08-26T16:40:47.000Z | [
"region:us"
] | Augusto777 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': CNV
'1': DME
'2': DRUSEN
'3': NORMAL
splits:
- name: train
num_bytes: 34491675.0
num_examples: 480
download_size: 25828769
dataset_size: 34491675.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OCT2017"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/merge_new_para_detection_data_v9 | 2023-08-26T16:46:58.000Z | [
"region:us"
] | mHossain | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 10951576.2
num_examples: 64800
- name: test
num_bytes: 1216841.8
num_examples: 7200
download_size: 5498122
dataset_size: 12168418.0
---
# Dataset Card for "merge_new_para_detection_data_v9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_california_gosdt_l512_d3 | 2023-08-26T16:54:59.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5948000000
num_examples: 100000
- name: validation
num_bytes: 594800000
num_examples: 10000
download_size: 2215522994
dataset_size: 6542800000
---
# Dataset Card for "autotree_automl_california_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_covertype_gosdt_l512_d3_sd3 | 2023-08-26T17:10:49.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2014669554
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_covertype_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_covertype_gosdt_l512_d3_sd1 | 2023-08-26T17:24:24.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2015253906
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_covertype_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZhangCNN/MindData_zh | 2023-08-26T17:37:02.000Z | [
"license:apache-2.0",
"region:us"
] | ZhangCNN | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
yzhuang/autotree_automl_covertype_gosdt_l512_d3_sd2 | 2023-08-26T17:33:35.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2014047838
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_covertype_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
houck2040/detr_249_test | 2023-08-26T17:34:21.000Z | [
"license:mit",
"region:us"
] | houck2040 | null | null | null | 0 | 0 | ---
license: mit
---
|
Rahul89/data_set_property | 2023-08-26T17:35:38.000Z | [
"license:apache-2.0",
"region:us"
] | Rahul89 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Sushantmenon123/Kathakali | 2023-08-26T17:41:28.000Z | [
"region:us"
] | Sushantmenon123 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 71728.0
num_examples: 5
download_size: 72596
dataset_size: 71728.0
---
# Dataset Card for "Kathakali"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/PuffedConvo | 2023-08-26T17:44:54.000Z | [
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | Entry not found |
NobodyExistsOnTheInternet/PuffedLIMAsub4000 | 2023-08-28T05:42:41.000Z | [
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | PuffedConvo is a mix of Puffin and ConvoEvol, with a total of 11.6k instruct pairs.
It has been filtered to a maximum of 4,000 tokens using the LLaMA-2-13b-HF tokenizer.
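The token-budget filtering described above can be sketched roughly as follows. Note this is an illustrative sketch, not the dataset's actual pipeline: the whitespace tokenizer is a hypothetical stand-in for the real LLaMA-2-13b-HF tokenizer, and the example pairs are made up.

```python
# Sketch of length-based filtering for instruct pairs.
# NOTE: whitespace split is a stand-in for the real LLaMA-2-13b-HF tokenizer,
# and the example pairs below are illustrative, not from the dataset.

def token_count(text: str) -> int:
    # Stand-in tokenizer: one token per whitespace-separated word.
    return len(text.split())

def filter_pairs(pairs, max_tokens=4000):
    """Keep only pairs whose combined prompt + response fits the token budget."""
    return [
        (prompt, response)
        for prompt, response in pairs
        if token_count(prompt) + token_count(response) <= max_tokens
    ]

pairs = [
    ("short prompt", "short answer"),   # 4 tokens total: kept
    ("word " * 3000, "word " * 2000),   # 5000 tokens total: dropped
]
print(len(filter_pairs(pairs)))  # 1
```

In a real pipeline the `token_count` function would call the model tokenizer's `encode` method instead of splitting on whitespace.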
|
EvanLong/languages-translate-chinese | 2023-08-26T18:09:13.000Z | [
"license:openrail",
"region:us"
] | EvanLong | null | null | null | 0 | 0 | ---
license: openrail
---
|
jmaczan/rick-and-morty-scripts-llama-2 | 2023-08-26T17:59:45.000Z | [
"license:other",
"region:us"
] | jmaczan | null | null | null | 0 | 0 | ---
license: other
---
|
NobodyExistsOnTheInternet/LIMAalpaca | 2023-08-26T18:06:28.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
tuanvietarc/video | 2023-08-27T05:09:13.000Z | [
"region:us"
] | tuanvietarc | null | null | null | 0 | 0 | Entry not found |
ameerazam08/images | 2023-08-26T18:38:22.000Z | [
"region:us"
] | ameerazam08 | null | null | null | 0 | 0 | Entry not found |
Katharinelw/Book | 2023-08-26T18:56:47.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | Katharinelw | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
Katharinelw/Kk | 2023-08-26T19:00:44.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | Katharinelw | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
rhasspy/voice-datasets | 2023-08-26T19:10:51.000Z | [
"region:us"
] | rhasspy | null | null | null | 0 | 0 | Entry not found |
SaiedAlshahrani/Moroccan_Arabic_Wikipedia_20230101_nobots | 2023-08-26T19:10:56.000Z | [
"region:us"
] | SaiedAlshahrani | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7596217
num_examples: 5396
download_size: 2958669
dataset_size: 7596217
---
# Dataset Card for "Moroccan_Arabic_Wikipedia_20230101_nobots"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf | 2023-09-22T19:25:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-34b-Python-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:25:35.748901](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf/blob/main/results_2023-09-22T19-25-35.748901.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298454,\n \"f1\": 0.047479026845637595,\n\
\ \"f1_stderr\": 0.0011836496363564649,\n \"acc\": 0.46077036907609203,\n\
\ \"acc_stderr\": 0.011810507836002033\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298454,\n\
\ \"f1\": 0.047479026845637595,\n \"f1_stderr\": 0.0011836496363564649\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2001516300227445,\n \
\ \"acc_stderr\": 0.011021119022510191\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7213891081294396,\n \"acc_stderr\": 0.012599896649493876\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-34b-Python-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|arc:challenge|25_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_25_35.748901
path:
- '**/details_harness|drop|3_2023-09-22T19-25-35.748901.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-25-35.748901.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_25_35.748901
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-25-35.748901.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-25-35.748901.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hellaswag|10_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T20:08:27.081225.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T20:08:27.081225.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_25_35.748901
path:
- '**/details_harness|winogrande|5_2023-09-22T19-25-35.748901.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-25-35.748901.parquet'
- config_name: results
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- results_2023-08-26T20:08:27.081225.parquet
- split: 2023_09_22T19_25_35.748901
path:
- results_2023-09-22T19-25-35.748901.parquet
- split: latest
path:
- results_2023-09-22T19-25-35.748901.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-34b-Python-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-34b-Python-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf",
"harness_winogrande_5",
split="latest")
```
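As the configuration list above shows, each run's split name embeds a zero-padded timestamp (e.g. `2023_09_22T19_25_35.748901`), so the most recent run can also be found by plain lexicographic comparison of the split names. A minimal sketch, using the two split names from this card:

```python
# Split names embed a zero-padded timestamp, so plain string comparison
# orders runs chronologically; the "latest" alias is excluded before taking the max.
splits = ["2023_08_26T20_08_27.081225", "2023_09_22T19_25_35.748901", "latest"]
timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)  # 2023_09_22T19_25_35.748901
```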
## Latest results
These are the [latest results from run 2023-09-22T19:25:35.748901](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf/blob/main/results_2023-09-22T19-25-35.748901.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the timestamped and "latest" splits for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298454,
"f1": 0.047479026845637595,
"f1_stderr": 0.0011836496363564649,
"acc": 0.46077036907609203,
"acc_stderr": 0.011810507836002033
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298454,
"f1": 0.047479026845637595,
"f1_stderr": 0.0011836496363564649
},
"harness|gsm8k|5": {
"acc": 0.2001516300227445,
"acc_stderr": 0.011021119022510191
},
"harness|winogrande|5": {
"acc": 0.7213891081294396,
"acc_stderr": 0.012599896649493876
}
}
```
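The `"acc"` value under `"all"` is consistent with a simple unweighted mean of the per-task accuracies reported above. A quick sketch of that check (not the leaderboard's own aggregation code):

```python
# Unweighted mean of the per-task acc values from the results block above.
accs = {
    "harness|gsm8k|5": 0.2001516300227445,
    "harness|winogrande|5": 0.7213891081294396,
}
overall = sum(accs.values()) / len(accs)
# Matches the "all" acc of 0.46077036907609203 up to float rounding.
print(overall)
```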
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
johanmichel/Test | 2023-08-26T20:19:40.000Z | [
"region:us"
] | johanmichel | null | null | null | 0 | 0 | Entry not found |
alayaran/bodo_english_parallel_test | 2023-08-26T20:54:45.000Z | [
"license:mit",
"region:us"
] | alayaran | Bodo and English Parallel Sentences
2 languages, 3 bitexts
;) @alayaran | In progress | null | 0 | 0 | ---
license: mit
---
|
alayaran/bodo_english_parallel_valid | 2023-08-26T21:00:37.000Z | [
"license:mit",
"region:us"
] | alayaran | Bodo and English Parallel Sentences
2 languages, 3 bitexts
;) @alayaran | In progress | null | 0 | 0 | ---
license: mit
---
|
Nagabhushan27/TestFirst | 2023-08-26T21:09:26.000Z | [
"region:us"
] | Nagabhushan27 | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jorgeortizfuentes/sfl_automatization_spanish_attitude | 2023-08-26T21:14:28.000Z | [
"region:us"
] | jorgeortizfuentes | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: att_tags
sequence:
class_label:
names:
'0': B-Affect
'1': I-Positive
'2': B-Positive
'3': B-Judgment (J1)
'4': B-tenacity (J3)
'5': B-Negative
'6': I-capacity (J3)
'7': I-Appreciation
'8': B-capacity (J3)
'9': I-tenacity (J3)
'10': B-Social Esteem (J2)
'11': I-Negative
'12': O
'13': B-Appreciation
'14': I-Affect
'15': B-Social Sanction (J2)
'16': I-propriety (J3)
'17': I-veracity (J3)
'18': B-normality (J3)
'19': I-Social Sanction (J2)
'20': B-propriety (J3)
'21': B-veracity (J3)
'22': I-normality (J3)
'23': I-Judgment (J1)
'24': I-Social Esteem (J2)
splits:
- name: train
num_bytes: 1492776.194221509
num_examples: 1993
- name: validation
num_bytes: 373755.8057784912
num_examples: 499
download_size: 486331
dataset_size: 1866532.0
---
# Dataset Card for "sfl_automatization_spanish_attitude"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
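Since `att_tags` is stored as `class_label` ids, decoding a tagged example back to its BIO labels is a plain lookup into the names list above. A minimal sketch using a hypothetical excerpt of that mapping:

```python
# Hypothetical excerpt of the id -> label mapping from the class_label names above.
id2label = {
    0: "B-Affect",
    7: "I-Appreciation",
    12: "O",
    13: "B-Appreciation",
}
att_tags = [13, 7, 12, 0]  # example id sequence
labels = [id2label[i] for i in att_tags]
print(labels)  # ['B-Appreciation', 'I-Appreciation', 'O', 'B-Affect']
```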