Schema of the dataset index (field: type, with observed min–max lengths or value ranges):

- id: string (length 2–115)
- lastModified: string (length 24)
- tags: list
- author: string (length 2–42)
- description: string (length 0–6.67k)
- citation: string (length 0–10.7k)
- likes: int64 (0–3.66k)
- downloads: int64 (0–8.89M)
- created: timestamp[us]
- card: string (length 11–977k)
- card_len: int64 (11–977k)
- embeddings: list
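The schema above can be exercised with a minimal sketch. The rows below are invented stand-ins shaped like the index entries (only the field names come from the schema); real values would be loaded from the dataset itself.

```python
# Hypothetical index rows mirroring the schema above; values are invented.
rows = [
    {"id": "ostapeno/gpt4_alpaca", "author": "ostapeno",
     "likes": 0, "downloads": 0, "card_len": 497},
    {"id": "nightmare-nectarine/segmentation-carla-driving",
     "author": "nightmare-nectarine",
     "likes": 0, "downloads": 0, "card_len": 1193},
]

# Browse the index the way the schema suggests, e.g. longest card first.
by_card_len = sorted(rows, key=lambda r: r["card_len"], reverse=True)
print(by_card_len[0]["id"])
```

The same pattern applies to any of the schema's numeric fields (likes, downloads) as a sort key.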
ostapeno/gpt4_alpaca
2023-10-09T21:05:06.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:05:03
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 44542771 num_examples: 52002 download_size: 24271598 dataset_size: 44542771 --- # Dataset Card for "gpt4_alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
497
[ [ -0.052001953125, -0.025634765625, 0.024749755859375, 0.019927978515625, -0.03302001953125, -0.011077880859375, 0.03582763671875, -0.0228424072265625, 0.06500244140625, 0.026702880859375, -0.0557861328125, -0.05828857421875, -0.054168701171875, -0.01225280761...
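The `dataset_info` block in the card above declares each record as `dataset` (string), `id` (string), and `messages` (a list of `{role, content}` pairs). A minimal sketch of validating a record against that declared shape, with an invented sample record:

```python
# Invented sample record shaped like the features block in the card above.
record = {
    "dataset": "gpt4_alpaca",
    "id": "example-0",
    "messages": [
        {"role": "user", "content": "Give three tips for staying healthy."},
        {"role": "assistant", "content": "Eat well, sleep enough, exercise."},
    ],
}

def is_valid(rec):
    """Check a record against the declared feature types."""
    if not (isinstance(rec.get("dataset"), str) and isinstance(rec.get("id"), str)):
        return False
    msgs = rec.get("messages")
    return isinstance(msgs, list) and all(
        isinstance(m.get("role"), str) and isinstance(m.get("content"), str)
        for m in msgs
    )

print(is_valid(record))
```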
ostapeno/code_alpaca
2023-10-09T21:05:09.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:05:07
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 7830075 num_examples: 20022 download_size: 3538209 dataset_size: 7830075 --- # Dataset Card for "code_alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
494
[ [ -0.050506591796875, -0.023773193359375, 0.0053863525390625, 0.0274658203125, -0.0220947265625, -0.00490570068359375, 0.0221710205078125, -0.0182647705078125, 0.07501220703125, 0.034515380859375, -0.047943115234375, -0.062469482421875, -0.043212890625, -0.019...
YaHi/chinese_AAAI_Math
2023-10-09T21:06:28.000Z
[ "region:us" ]
YaHi
null
null
0
0
2023-10-09T21:06:27
--- dataset_info: features: - name: dataset_version dtype: timestamp[s] - name: queId dtype: string - name: difficulty dtype: string - name: qtype dtype: string - name: problem dtype: string - name: knowledge_point_routes sequence: string splits: - name: train num_bytes: 2911523 num_examples: 7436 download_size: 1485592 dataset_size: 2911523 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "chinese_AAAI_Math" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
659
[ [ -0.0313720703125, -0.023040771484375, -0.00011879205703735352, 0.037628173828125, -0.0028553009033203125, -0.0020122528076171875, 0.0163116455078125, -0.01335906982421875, 0.048797607421875, 0.013427734375, -0.0556640625, -0.052490234375, -0.0211181640625, -...
nightmare-nectarine/segmentation-carla-driving
2023-10-12T01:36:11.000Z
[ "size_categories:10B<n<100B", "language:en", "license:mit", "Autonomous Driving", "CARLA Simulator", "ImitationLearning", "region:us" ]
nightmare-nectarine
null
null
0
0
2023-10-09T21:15:59
--- license: mit language: - en tags: - Autonomous Driving - CARLA Simulator - ImitationLearning size_categories: - 10B<n<100B pretty_name: S --- This dataset consists of 80 episodes of driving data collected using an autopilot agent in CARLA simulator for training imitation learning models for autonomous driving tasks. Each frame is structured as follows: ``` frame_data = { 'frame': the frame index, 'hlc': an integer representing the high-level command, 'light': an integer representing current traffic light status, 'controls': an array of [throttle, steer, brake], 'measurements': current speed in km/h, 'rgb': rgb camera image, 'segmentation': ground truth segmentation image, } ``` This dataset is used in [this project](https://github.com/TheRoboticsClub/gsoc2023-Meiqi_Zhao) and the trained models are available [here](https://huggingface.co/nightmare-nectarine/segmentation-based-imitation-learning-in-CARLA). Check out the [example code](https://github.com/TheRoboticsClub/gsoc2023-Meiqi_Zhao/blob/main/src/ModifiedDeepestLSTMTinyPilotNet/utils/load_dataset.py) for loading the dataset.
1,193
[ [ -0.0236968994140625, -0.031982421875, 0.037322998046875, 0.0126953125, -0.00420379638671875, -0.00667572021484375, 0.019287109375, -0.01531982421875, 0.006587982177734375, 0.035797119140625, -0.06024169921875, -0.032806396484375, -0.0177764892578125, -0.0137...
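The frame layout documented in the CARLA card above can be consumed roughly as follows. Only the field names come from the card; the sample values are invented, and `rgb` / `segmentation` would be images in the real dataset.

```python
# Invented sample frame following the structure documented in the card.
frame_data = {
    "frame": 0,
    "hlc": 3,                      # high-level command
    "light": 1,                    # traffic light status
    "controls": [0.5, -0.1, 0.0],  # [throttle, steer, brake]
    "measurements": 27.4,          # current speed in km/h
    "rgb": None,                   # rgb camera image
    "segmentation": None,          # ground-truth segmentation image
}

# Unpack the control vector in the order the card specifies.
throttle, steer, brake = frame_data["controls"]
speed_kmh = frame_data["measurements"]
print(throttle, steer, brake, speed_kmh)
```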
autoevaluate/autoeval-eval-squad_v2-squad_v2-6e4e67-94066145983
2023-10-09T21:22:42.000Z
[ "region:us" ]
autoevaluate
null
null
0
0
2023-10-09T21:22:38
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
ostapeno/tulu_v2_cot_subset
2023-10-09T21:23:58.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:23:55
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 57705790 num_examples: 50000 download_size: 25971959 dataset_size: 57705790 --- # Dataset Card for "tulu_v2_cot_subset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
504
[ [ -0.033966064453125, -0.019439697265625, 0.0010042190551757812, 0.014434814453125, -0.0299530029296875, 0.0126953125, 0.0215606689453125, -0.0144805908203125, 0.0423583984375, 0.0262451171875, -0.038482666015625, -0.043182373046875, -0.049072265625, -0.024719...
ostapeno/tulu_v2_flan_v2_subset
2023-10-09T21:24:03.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:23:58
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 111227584 num_examples: 50000 download_size: 64903414 dataset_size: 111227584 --- # Dataset Card for "tulu_v2_flan_v2_subset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
510
[ [ -0.03271484375, -0.0231170654296875, -0.005970001220703125, 0.01059722900390625, -0.0206298828125, 0.00222015380859375, 0.022857666015625, -0.0159149169921875, 0.053466796875, 0.0278472900390625, -0.047698974609375, -0.0242767333984375, -0.037750244140625, -...
ostapeno/tulu_v2_oasst1_subset
2023-10-09T21:24:26.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:24:24
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 12306024 num_examples: 7708 download_size: 7059985 dataset_size: 12306024 --- # Dataset Card for "tulu_v2_oasst1_subset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
505
[ [ -0.0242462158203125, -0.0273895263671875, -0.0021877288818359375, 0.004627227783203125, -0.02569580078125, 0.0001329183578491211, 0.036895751953125, -0.00913238525390625, 0.048614501953125, 0.0234832763671875, -0.0489501953125, -0.03045654296875, -0.042877197265...
ostapeno/tulu_v2_gpt4_alpaca_subset
2023-10-09T21:24:33.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:24:31
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 16994301 num_examples: 20000 download_size: 9302507 dataset_size: 16994301 --- # Dataset Card for "tulu_v2_gpt4_alpaca_subset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
511
[ [ -0.04241943359375, -0.026214599609375, 0.006603240966796875, 0.01496124267578125, -0.03564453125, -0.0006952285766601562, 0.034820556640625, -0.0181427001953125, 0.055694580078125, 0.0232391357421875, -0.048065185546875, -0.040374755859375, -0.049713134765625, ...
ostapeno/tulu_v2_code_alpaca_subset
2023-10-09T21:24:35.000Z
[ "region:us" ]
ostapeno
null
null
0
0
2023-10-09T21:24:34
--- dataset_info: features: - name: dataset dtype: string - name: id dtype: string - name: messages list: - name: role dtype: string - name: content dtype: string splits: - name: train num_bytes: 7823498 num_examples: 20022 download_size: 3528838 dataset_size: 7823498 --- # Dataset Card for "tulu_v2_code_alpaca_subset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
509
[ [ -0.039398193359375, -0.024261474609375, -0.0027370452880859375, 0.0193939208984375, -0.032135009765625, -0.0012311935424804688, 0.0318603515625, -0.0167694091796875, 0.056304931640625, 0.0299224853515625, -0.044677734375, -0.04132080078125, -0.04425048828125, ...
open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state
2023-10-09T21:39:54.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T21:38:56
--- pretty_name: Evaluation run of xiaol/RWKV-v4-raven-14B-one-state dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [xiaol/RWKV-v4-raven-14B-one-state](https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-09T21:38:42.028709](https://huggingface.co/datasets/open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state/blob/main/results_2023-10-09T21-38-42.028709.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33924685524661535,\n\ \ \"acc_stderr\": 0.03400094010286168,\n \"acc_norm\": 0.3432206955736541,\n\ \ \"acc_norm_stderr\": 0.03399555734342263,\n \"mc1\": 0.2484700122399021,\n\ \ \"mc1_stderr\": 0.015127427096520681,\n \"mc2\": 0.37298301233557335,\n\ \ \"mc2_stderr\": 0.014007983938605419\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.014397070564409172,\n\ \ \"acc_norm\": 0.45733788395904434,\n \"acc_norm_stderr\": 0.01455810654392407\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5230033857797252,\n\ \ \"acc_stderr\": 0.004984497871025246,\n \"acc_norm\": 0.714797849034057,\n\ \ \"acc_norm_stderr\": 0.00450587908460685\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\ \ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.34074074074074073,\n\ \ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436004,\n\ \ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436004\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.36981132075471695,\n \"acc_stderr\": 0.029711421880107915,\n\ \ \"acc_norm\": 0.36981132075471695,\n \"acc_norm_stderr\": 0.029711421880107915\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\ \ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\ \ \"acc_norm_stderr\": 
0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\ \ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\ \ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\ \ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853443,\n\ \ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853443\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\ \ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\ acc_norm\": 
0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\ \ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\ \ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\ \ \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.3419354838709677,\n\ \ \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n\ \ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n\ \ \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178815,\n \"\ acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178815\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048575,\n\ \ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048575\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\ \ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n\ \ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2119205298013245,\n \"acc_stderr\": 0.033367670865679766,\n \"\ acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.033367670865679766\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3798165137614679,\n \"acc_stderr\": 0.020808825617866244,\n \"\ acc_norm\": 0.3798165137614679,\n \"acc_norm_stderr\": 0.020808825617866244\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057986,\n \"\ acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057986\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.4166666666666667,\n \"acc_stderr\": 0.03460228327239171,\n \"\ acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03460228327239171\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.5232067510548524,\n \"acc_stderr\": 0.032512152011410174,\n \ \ \"acc_norm\": 0.5232067510548524,\n \"acc_norm_stderr\": 0.032512152011410174\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\ \ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\ \ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\ \ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"\ acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\ \ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.4351851851851852,\n\ \ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\ \ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\ \ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\ \ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.44871794871794873,\n\ \ \"acc_stderr\": 0.032583346493868806,\n \"acc_norm\": 0.44871794871794873,\n\ \ \"acc_norm_stderr\": 0.032583346493868806\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.41507024265644954,\n\ \ \"acc_stderr\": 0.017620137003655268,\n \"acc_norm\": 0.41507024265644954,\n\ \ \"acc_norm_stderr\": 0.017620137003655268\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546534,\n\ \ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546534\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\ \ \"acc_stderr\": 
0.014400296429225596,\n \"acc_norm\": 0.24581005586592178,\n\ \ \"acc_norm_stderr\": 0.014400296429225596\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.026568921015457152,\n\ \ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.026568921015457152\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3408360128617363,\n\ \ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.3408360128617363,\n\ \ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.33641975308641975,\n \"acc_stderr\": 0.026289734945952926,\n\ \ \"acc_norm\": 0.33641975308641975,\n \"acc_norm_stderr\": 0.026289734945952926\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \ \ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33116036505867014,\n\ \ \"acc_stderr\": 0.012020128195985757,\n \"acc_norm\": 0.33116036505867014,\n\ \ \"acc_norm_stderr\": 0.012020128195985757\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.2867647058823529,\n \"acc_stderr\": 0.02747227447323382,\n\ \ \"acc_norm\": 0.2867647058823529,\n \"acc_norm_stderr\": 0.02747227447323382\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.3235294117647059,\n \"acc_stderr\": 0.018926082916083393,\n \ \ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.018926082916083393\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\ \ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.37272727272727274,\n\ \ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174913,\n\ \ \"acc_norm\": 0.24489795918367346,\n 
\"acc_norm_stderr\": 0.027529637440174913\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n\ \ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n\ \ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\ \ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\ \ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03811079669833531,\n\ \ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03811079669833531\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\ \ \"mc1_stderr\": 0.015127427096520681,\n \"mc2\": 0.37298301233557335,\n\ \ \"mc2_stderr\": 0.014007983938605419\n }\n}\n```" repo_url: https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|arc:challenge|25_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hellaswag|10_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T21_38_42.028709 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T21-38-42.028709.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T21-38-42.028709.parquet' - config_name: results data_files: - split: 2023_10_09T21_38_42.028709 path: - results_2023-10-09T21-38-42.028709.parquet - split: latest path: - results_2023-10-09T21-38-42.028709.parquet
---

# Dataset Card for Evaluation run of xiaol/RWKV-v4-raven-14B-one-state

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model
[xiaol/RWKV-v4-raven-14B-one-state](https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-09T21:38:42.028709](https://huggingface.co/datasets/open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state/blob/main/results_2023-10-09T21-38-42.028709.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.33924685524661535, "acc_stderr": 0.03400094010286168, "acc_norm": 0.3432206955736541, "acc_norm_stderr": 0.03399555734342263, "mc1": 0.2484700122399021, "mc1_stderr": 0.015127427096520681, "mc2": 0.37298301233557335, "mc2_stderr": 0.014007983938605419 }, "harness|arc:challenge|25": { "acc": 0.41467576791808874, "acc_stderr": 0.014397070564409172, "acc_norm": 0.45733788395904434, "acc_norm_stderr": 0.01455810654392407 }, "harness|hellaswag|10": { "acc": 0.5230033857797252, "acc_stderr": 0.004984497871025246, "acc_norm": 0.714797849034057, "acc_norm_stderr": 0.00450587908460685 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816507, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.34074074074074073, "acc_stderr": 0.040943762699967926, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.040943762699967926 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2236842105263158, "acc_stderr": 0.033911609343436004, "acc_norm": 0.2236842105263158, "acc_norm_stderr": 0.033911609343436004 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.36981132075471695, "acc_stderr": 0.029711421880107915, "acc_norm": 0.36981132075471695, "acc_norm_stderr": 0.029711421880107915 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2916666666666667, "acc_stderr": 0.03800968060554858, "acc_norm": 0.2916666666666667, "acc_norm_stderr": 0.03800968060554858 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.17, "acc_stderr": 0.03775251680686371, "acc_norm": 0.17, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, 
"acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.30057803468208094, "acc_stderr": 0.0349610148119118, "acc_norm": 0.30057803468208094, "acc_norm_stderr": 0.0349610148119118 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929777, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929777 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3404255319148936, "acc_stderr": 0.03097669299853443, "acc_norm": 0.3404255319148936, "acc_norm_stderr": 0.03097669299853443 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.0433913832257986, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.0433913832257986 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3448275862068966, "acc_stderr": 0.03960933549451208, "acc_norm": 0.3448275862068966, "acc_norm_stderr": 0.03960933549451208 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24867724867724866, "acc_stderr": 0.02226181769240017, "acc_norm": 0.24867724867724866, "acc_norm_stderr": 0.02226181769240017 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147126, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147126 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3419354838709677, "acc_stderr": 0.026985289576552742, "acc_norm": 0.3419354838709677, "acc_norm_stderr": 0.026985289576552742 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.26108374384236455, "acc_stderr": 0.030903796952114482, "acc_norm": 0.26108374384236455, "acc_norm_stderr": 0.030903796952114482 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.48484848484848486, "acc_stderr": 0.03902551007374448, "acc_norm": 0.48484848484848486, "acc_norm_stderr": 0.03902551007374448 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.29292929292929293, "acc_stderr": 0.03242497958178815, "acc_norm": 0.29292929292929293, "acc_norm_stderr": 0.03242497958178815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.37305699481865284, "acc_stderr": 0.03490205592048575, "acc_norm": 0.37305699481865284, "acc_norm_stderr": 0.03490205592048575 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.27692307692307694, "acc_stderr": 0.022688042352424994, "acc_norm": 0.27692307692307694, "acc_norm_stderr": 0.022688042352424994 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712166, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712166 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2689075630252101, "acc_stderr": 0.028801392193631276, "acc_norm": 0.2689075630252101, "acc_norm_stderr": 0.028801392193631276 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2119205298013245, "acc_stderr": 0.033367670865679766, "acc_norm": 0.2119205298013245, "acc_norm_stderr": 0.033367670865679766 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3798165137614679, "acc_stderr": 0.020808825617866244, "acc_norm": 0.3798165137614679, "acc_norm_stderr": 0.020808825617866244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2037037037037037, "acc_stderr": 
0.027467401804057986, "acc_norm": 0.2037037037037037, "acc_norm_stderr": 0.027467401804057986 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4166666666666667, "acc_stderr": 0.03460228327239171, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03460228327239171 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5232067510548524, "acc_stderr": 0.032512152011410174, "acc_norm": 0.5232067510548524, "acc_norm_stderr": 0.032512152011410174 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.47085201793721976, "acc_stderr": 0.03350073248773404, "acc_norm": 0.47085201793721976, "acc_norm_stderr": 0.03350073248773404 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3435114503816794, "acc_stderr": 0.041649760719448786, "acc_norm": 0.3435114503816794, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.4132231404958678, "acc_stderr": 0.04495087843548408, "acc_norm": 0.4132231404958678, "acc_norm_stderr": 0.04495087843548408 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4351851851851852, "acc_stderr": 0.04792898170907061, "acc_norm": 0.4351851851851852, "acc_norm_stderr": 0.04792898170907061 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3128834355828221, "acc_stderr": 0.036429145782924055, "acc_norm": 0.3128834355828221, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.044328040552915185, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.044328040552915185 }, "harness|hendrycksTest-management|5": { "acc": 0.27184466019417475, "acc_stderr": 0.044052680241409216, "acc_norm": 0.27184466019417475, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.44871794871794873, "acc_stderr": 0.032583346493868806, "acc_norm": 0.44871794871794873, "acc_norm_stderr": 0.032583346493868806 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.36, 
"acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.41507024265644954, "acc_stderr": 0.017620137003655268, "acc_norm": 0.41507024265644954, "acc_norm_stderr": 0.017620137003655268 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.3988439306358382, "acc_stderr": 0.026362437574546534, "acc_norm": 0.3988439306358382, "acc_norm_stderr": 0.026362437574546534 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24581005586592178, "acc_stderr": 0.014400296429225596, "acc_norm": 0.24581005586592178, "acc_norm_stderr": 0.014400296429225596 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3137254901960784, "acc_stderr": 0.026568921015457152, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.026568921015457152 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3408360128617363, "acc_stderr": 0.026920841260776162, "acc_norm": 0.3408360128617363, "acc_norm_stderr": 0.026920841260776162 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.33641975308641975, "acc_stderr": 0.026289734945952926, "acc_norm": 0.33641975308641975, "acc_norm_stderr": 0.026289734945952926 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.26595744680851063, "acc_stderr": 0.026358065698880596, "acc_norm": 0.26595744680851063, "acc_norm_stderr": 0.026358065698880596 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.33116036505867014, "acc_stderr": 0.012020128195985757, "acc_norm": 0.33116036505867014, "acc_norm_stderr": 0.012020128195985757 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2867647058823529, "acc_stderr": 0.02747227447323382, "acc_norm": 0.2867647058823529, "acc_norm_stderr": 0.02747227447323382 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3235294117647059, "acc_stderr": 0.018926082916083393, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.018926082916083393 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.37272727272727274, "acc_stderr": 0.04631381319425464, "acc_norm": 0.37272727272727274, "acc_norm_stderr": 0.04631381319425464 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.24489795918367346, "acc_stderr": 0.027529637440174913, "acc_norm": 0.24489795918367346, "acc_norm_stderr": 0.027529637440174913 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4228855721393035, "acc_stderr": 0.034932317774212816, "acc_norm": 0.4228855721393035, "acc_norm_stderr": 0.034932317774212816 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03811079669833531, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03811079669833531 }, "harness|truthfulqa:mc|0": { "mc1": 0.2484700122399021, "mc1_stderr": 0.015127427096520681, "mc2": 0.37298301233557335, "mc2_stderr": 0.014007983938605419 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
65,151
[ [ -0.050567626953125, -0.057342529296875, 0.0164947509765625, 0.0171051025390625, -0.009002685546875, -0.00707244873046875, 0.00848388671875, -0.01410675048828125, 0.040374755859375, -0.0033016204833984375, -0.035400390625, -0.046600341796875, -0.0292510986328125,...
hmao/rule_learning_data_v0
2023-10-09T22:28:43.000Z
[ "region:us" ]
hmao
null
null
0
0
2023-10-09T22:25:38
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: rule dtype: string - name: task_name dtype: string - name: configuration dtype: string - name: description dtype: string - name: filepath dtype: string - name: old_instruction dtype: string - name: prompt dtype: string splits: - name: train num_bytes: 6226117 num_examples: 2009 download_size: 2213175 dataset_size: 6226117 --- # Dataset Card for "rule_learning_data_v0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
685
[ [ -0.0247802734375, -0.0135955810546875, 0.0170440673828125, 0.006771087646484375, -0.0164031982421875, -0.0198211669921875, 0.0205841064453125, -0.0004200935363769531, 0.05084228515625, 0.037872314453125, -0.06475830078125, -0.06719970703125, -0.0333251953125, ...
aurob96/your-dataset-name
2023-10-10T15:17:57.000Z
[ "region:us" ]
aurob96
null
null
0
0
2023-10-09T22:31:45
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
hmao/rule-sql-v1
2023-10-09T22:43:35.000Z
[ "region:us" ]
hmao
null
null
0
0
2023-10-09T22:43:20
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: instruction dtype: string - name: input dtype: string - name: response dtype: string - name: source dtype: string - name: text dtype: string - name: rule dtype: string - name: software dtype: string splits: - name: train num_bytes: 863452252 num_examples: 262208 download_size: 225135160 dataset_size: 863452252 --- # Dataset Card for "rule-sql-v1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
663
[ [ -0.0283203125, -0.0357666015625, 0.01263427734375, 0.0233154296875, -0.03704833984375, -0.036285400390625, 0.027679443359375, -0.0014028549194335938, 0.06573486328125, 0.055938720703125, -0.08551025390625, -0.0625, -0.0207061767578125, -0.0233917236328125, ...
niyar/test-tree-rings
2023-10-09T23:05:28.000Z
[ "region:us" ]
niyar
null
null
0
0
2023-10-09T22:46:02
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b
2023-10-29T09:59:25.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T23:41:54
--- pretty_name: Evaluation run of PocketDoc/Dans-TotSirocco-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PocketDoc/Dans-TotSirocco-7b](https://huggingface.co/PocketDoc/Dans-TotSirocco-7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T12:54:48.005243](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b/blob/main/results_2023-10-23T12-54-48.005243.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.44997902684563756,\n\ \ \"em_stderr\": 0.00509477973209699,\n \"f1\": 0.49544777684563845,\n\ \ \"f1_stderr\": 0.00490923385938236,\n \"acc\": 0.45978722729484023,\n\ \ \"acc_stderr\": 0.01042644341108249\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.44997902684563756,\n \"em_stderr\": 0.00509477973209699,\n\ \ \"f1\": 0.49544777684563845,\n \"f1_stderr\": 0.00490923385938236\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1326762699014405,\n \ \ \"acc_stderr\": 0.009343929131442216\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722764\n\ \ }\n}\n```" repo_url: https://huggingface.co/PocketDoc/Dans-TotSirocco-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|arc:challenge|25_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|arc:challenge|25_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T03-08-42.670420.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T12_54_48.005243 path: - '**/details_harness|drop|3_2023-10-23T12-54-48.005243.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T12-54-48.005243.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T12_54_48.005243 path: - '**/details_harness|gsm8k|5_2023-10-23T12-54-48.005243.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T12-54-48.005243.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hellaswag|10_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - 
'**/details_harness|hellaswag|10_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T23-41-30.846721.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T23-41-30.846721.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T23-41-30.846721.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet' - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet' - 
'**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet' - 
'**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet' - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet' - 
'**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet' - 
'**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 
2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - 
'**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet' - 
config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - 
split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet' - config_name: 
harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - 
'**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T23_41_30.846721 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T23-41-30.846721.parquet' - split: 2023_10_10T03_08_42.670420 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T03-08-42.670420.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T03-08-42.670420.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T12_54_48.005243 path: - '**/details_harness|winogrande|5_2023-10-23T12-54-48.005243.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T12-54-48.005243.parquet' - config_name: results data_files: - split: 2023_10_09T23_41_30.846721 path: - results_2023-10-09T23-41-30.846721.parquet - split: 2023_10_10T03_08_42.670420 path: - results_2023-10-10T03-08-42.670420.parquet - split: 2023_10_23T12_54_48.005243 path: - results_2023-10-23T12-54-48.005243.parquet - split: latest path: - results_2023-10-23T12-54-48.005243.parquet --- # Dataset Card for Evaluation run of PocketDoc/Dans-TotSirocco-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PocketDoc/Dans-TotSirocco-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during 
the evaluation run of model [PocketDoc/Dans-TotSirocco-7b](https://huggingface.co/PocketDoc/Dans-TotSirocco-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T12:54:48.005243](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b/blob/main/results_2023-10-23T12-54-48.005243.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.44997902684563756, "em_stderr": 0.00509477973209699, "f1": 0.49544777684563845, "f1_stderr": 0.00490923385938236, "acc": 0.45978722729484023, "acc_stderr": 0.01042644341108249 }, "harness|drop|3": { "em": 0.44997902684563756, "em_stderr": 0.00509477973209699, "f1": 0.49544777684563845, "f1_stderr": 0.00490923385938236 }, "harness|gsm8k|5": { "acc": 0.1326762699014405, "acc_stderr": 0.009343929131442216 }, "harness|winogrande|5": { "acc": 0.7868981846882399, "acc_stderr": 0.011508957690722764 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
52,865
[ [ -0.0279083251953125, -0.046478271484375, 0.025970458984375, 0.017486572265625, -0.016357421875, 0.003631591796875, -0.0309295654296875, -0.00611114501953125, 0.0284271240234375, 0.042999267578125, -0.05084228515625, -0.07440185546875, -0.05145263671875, 0.01...
ABD7667/fgffgfgf
2023-10-09T23:49:35.000Z
[ "region:us" ]
ABD7667
null
null
0
0
2023-10-09T23:49:35
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
AsAHuman/AnomalyCLIP
2023-10-10T10:58:04.000Z
[ "region:us" ]
AsAHuman
null
null
0
0
2023-10-09T23:51:01
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
KenDoStudio/MMVCServerSIO-demo
2023-10-10T01:02:42.000Z
[ "region:us" ]
KenDoStudio
null
null
0
0
2023-10-10T00:38:24
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
LSVR/Vetroquis
2023-10-10T02:45:50.000Z
[ "region:us" ]
LSVR
null
null
0
0
2023-10-10T02:35:04
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
sheepy928/purdue_reddit_posts_2017_2022
2023-10-10T02:49:57.000Z
[ "region:us" ]
sheepy928
null
null
0
0
2023-10-10T02:49:54
--- dataset_info: features: - name: title dtype: string - name: selftext dtype: string - name: created_utc dtype: timestamp[ns] - name: url dtype: string - name: author dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 25865572 num_examples: 78849 download_size: 15617426 dataset_size: 25865572 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "purdue_reddit_posts_2017_2022" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
658
[ [ -0.026031494140625, -0.0211334228515625, 0.0357666015625, 0.038482666015625, -0.020751953125, -0.0205535888671875, 0.012969970703125, -0.017120361328125, 0.057037353515625, 0.02081298828125, -0.059356689453125, -0.062744140625, -0.03857421875, -0.00051403045...
sheepy928/Purdue_reddit_posts_1500_unlabelled
2023-10-10T02:50:03.000Z
[ "region:us" ]
sheepy928
null
null
0
0
2023-10-10T02:50:02
--- dataset_info: features: - name: title dtype: string - name: selftext dtype: string - name: created_utc dtype: timestamp[ns] - name: url dtype: string - name: author dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 504948 num_examples: 1500 download_size: 321568 dataset_size: 504948 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Purdue_reddit_posts_1500" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
646
[ [ -0.034942626953125, -0.025634765625, 0.03131103515625, 0.054046630859375, -0.0110626220703125, -0.0209503173828125, 0.01444244384765625, -0.006420135498046875, 0.0623779296875, 0.01837158203125, -0.05242919921875, -0.058135986328125, -0.041290283203125, 0.00...
LSVR1806/Vetroquis
2023-10-10T03:03:43.000Z
[ "region:us" ]
LSVR1806
null
null
0
0
2023-10-10T03:02:52
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
iara-project/test_split_with_embeddings_bert_base_portuguese
2023-10-10T03:04:17.000Z
[ "region:us" ]
iara-project
null
null
0
0
2023-10-10T03:04:00
--- configs: - config_name: default data_files: - split: test path: data/test-* dataset_info: features: - name: news_id dtype: int64 - name: embeddings dtype: int64 - name: sentence dtype: string - name: category dtype: string splits: - name: test num_bytes: 588008891 num_examples: 176114 download_size: 365796407 dataset_size: 588008891 --- # Dataset Card for "test_split_with_embeddings_bert_base_portuguese" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
594
[ [ -0.050140380859375, -0.0391845703125, -0.0025539398193359375, 0.03338623046875, -0.035675048828125, 0.0093994140625, -0.002246856689453125, -0.00804901123046875, 0.06634521484375, 0.02386474609375, -0.0499267578125, -0.051788330078125, -0.049041748046875, -0...
mmhzlrj/Genealogy
2023-10-10T03:34:14.000Z
[ "language:zh", "license:apache-2.0", "region:us" ]
mmhzlrj
null
null
0
0
2023-10-10T03:20:08
--- license: apache-2.0 language: - zh --- 数据集包含了一本族谱的封面和164页内容,是竖版的中文简体和繁体字的组合。 The dataset contains the cover and 164 pages of a family tree, which is a combination of simplified and traditional Chinese characters in a vertical version.
238
[ [ -0.026519775390625, -0.01171875, -0.0162811279296875, 0.03778076171875, -0.0455322265625, -0.01325225830078125, 0.028717041015625, -0.0229339599609375, 0.0262908935546875, 0.05438232421875, -0.04974365234375, -0.042877197265625, -0.034271240234375, 0.0019483...
SRGui/autotrain-data-resnet50_test
2023-10-10T05:31:27.000Z
[ "task_categories:image-classification", "region:us" ]
SRGui
null
null
0
0
2023-10-10T03:27:16
--- task_categories: - image-classification --- # AutoTrain Dataset for project: resnet50_test ## Dataset Description This dataset has been automatically processed by AutoTrain for project resnet50_test. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "image": "<1920x1920 RGB PIL image>", "target": 2 }, { "image": "<1080x721 RGB PIL image>", "target": 2 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "image": "Image(decode=True, id=None)", "target": "ClassLabel(names=['000', '005', '033'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: | Split name | Num samples | | ------------ | ------------------- | | train | 2244 | | valid | 564 |
952
[ [ -0.04638671875, 0.01520538330078125, -0.004581451416015625, 0.018707275390625, -0.0196380615234375, 0.01325225830078125, -0.0107421875, -0.023712158203125, 0.0029506683349609375, 0.02935791015625, -0.05120849609375, -0.038665771484375, -0.0254974365234375, 0...
ilyas3141/ilias_test16
2023-10-10T03:45:39.000Z
[ "region:us" ]
ilyas3141
null
null
0
0
2023-10-10T03:45:39
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b
2023-10-24T16:13:42.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T04:05:21
--- pretty_name: Evaluation run of PocketDoc/Dans-AdventurousWinds-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PocketDoc/Dans-AdventurousWinds-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T16:13:28.760766](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b/blob/main/results_2023-10-24T16-13-28.760766.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32791526845637586,\n\ \ \"em_stderr\": 0.004807646038011011,\n \"f1\": 0.3764691694630872,\n\ \ \"f1_stderr\": 0.004686966609320671,\n \"acc\": 0.46954983116649207,\n\ \ \"acc_stderr\": 0.010810156337777745\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.32791526845637586,\n \"em_stderr\": 0.004807646038011011,\n\ \ \"f1\": 0.3764691694630872,\n \"f1_stderr\": 0.004686966609320671\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15693707354056102,\n \ \ \"acc_stderr\": 0.010019246595616167\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\ \ }\n}\n```" repo_url: https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|arc:challenge|25_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T04-04-57.551374.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T16_13_28.760766 path: - '**/details_harness|drop|3_2023-10-24T16-13-28.760766.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T16-13-28.760766.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T16_13_28.760766 path: - '**/details_harness|gsm8k|5_2023-10-24T16-13-28.760766.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T16-13-28.760766.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hellaswag|10_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T04_04_57.551374 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T04-04-57.551374.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T04-04-57.551374.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T16_13_28.760766 path: - '**/details_harness|winogrande|5_2023-10-24T16-13-28.760766.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T16-13-28.760766.parquet' - config_name: results data_files: - split: 2023_10_10T04_04_57.551374 path: - results_2023-10-10T04-04-57.551374.parquet - split: 2023_10_24T16_13_28.760766 path: - results_2023-10-24T16-13-28.760766.parquet - split: latest path: - results_2023-10-24T16-13-28.760766.parquet --- # Dataset Card for Evaluation run of PocketDoc/Dans-AdventurousWinds-7b ## Dataset Description - 
**Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [PocketDoc/Dans-AdventurousWinds-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T16:13:28.760766](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b/blob/main/results_2023-10-24T16-13-28.760766.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.32791526845637586,
        "em_stderr": 0.004807646038011011,
        "f1": 0.3764691694630872,
        "f1_stderr": 0.004686966609320671,
        "acc": 0.46954983116649207,
        "acc_stderr": 0.010810156337777745
    },
    "harness|drop|3": {
        "em": 0.32791526845637586,
        "em_stderr": 0.004807646038011011,
        "f1": 0.3764691694630872,
        "f1_stderr": 0.004686966609320671
    },
    "harness|gsm8k|5": {
        "acc": 0.15693707354056102,
        "acc_stderr": 0.010019246595616167
    },
    "harness|winogrande|5": {
        "acc": 0.7821625887924231,
        "acc_stderr": 0.011601066079939324
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
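The `load_dataset` snippet earlier in this card targets one configuration by its full name. As a minimal sketch, that name can also be built from a harness task id and a few-shot count; the `config_name` and `load_eval_details` helpers below are assumptions inferred from the config names listed in this card (e.g. `arc:challenge` at 25 shots maps to `harness_arc_challenge_25`, `hendrycksTest-anatomy` at 5 shots to `harness_hendrycksTest_anatomy_5`), and actually fetching rows requires the `datasets` library plus network access to the Hugging Face Hub:

```python
REPO = "open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b"

def config_name(task: str, num_fewshot: int) -> str:
    # Config names in this card follow "harness_<task>_<num_fewshot>",
    # with ':' and '-' in the harness task id mapped to '_'
    # (e.g. "arc:challenge" -> "harness_arc_challenge_25").
    return "harness_{}_{}".format(task.replace(":", "_").replace("-", "_"), num_fewshot)

def load_eval_details(task: str, num_fewshot: int, split: str = "latest"):
    # Requires the `datasets` library and network access to the Hub; the
    # split is either "latest" or a run-timestamp split name as listed above.
    from datasets import load_dataset
    return load_dataset(REPO, config_name(task, num_fewshot), split=split)

print(config_name("winogrande", 5))  # -> harness_winogrande_5
```

`load_eval_details("winogrande", 5)` would then return the same rows as the snippet above, but from the "latest" split instead of "train".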
38,718
[ [ -0.0302886962890625, -0.053070068359375, 0.0238800048828125, 0.01494598388671875, -0.0131988525390625, 0.0089263916015625, -0.0308380126953125, -0.008636474609375, 0.02874755859375, 0.042510986328125, -0.051422119140625, -0.06884765625, -0.048004150390625, 0...
ilyas3141/ilias_test17
2023-10-10T06:26:00.000Z
[ "region:us" ]
ilyas3141
null
null
0
0
2023-10-10T04:16:31
Entry not found
15
[ [ -0.0213775634765625, -0.014984130859375, 0.05718994140625, 0.0288543701171875, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005062103271484375, 0.051361083984375, 0.016998291015625, -0.0521240234375, -0.01496124267578125, -0.0604248046875, 0.037...
unoooo/ko-alpaca
2023-10-10T04:30:34.000Z
[ "region:us" ]
unoooo
null
null
0
0
2023-10-10T04:30:34
Entry not found
15
[ [ -0.0213775634765625, -0.014984130859375, 0.05718994140625, 0.0288543701171875, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005062103271484375, 0.051361083984375, 0.016998291015625, -0.0521240234375, -0.01496124267578125, -0.0604248046875, 0.037...
open-llm-leaderboard/details_maywell__Synatra-V0.1-7B
2023-10-23T11:25:26.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T04:31:22
--- pretty_name: Evaluation run of maywell/Synatra-V0.1-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [maywell/Synatra-V0.1-7B](https://huggingface.co/maywell/Synatra-V0.1-7B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-V0.1-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T11:25:13.204412](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B/blob/main/results_2023-10-23T11-25-13.204412.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32246224832214765,\n\ \ \"em_stderr\": 0.004786806140711669,\n \"f1\": 0.3963055788590608,\n\ \ \"f1_stderr\": 0.004634063813539812,\n \"acc\": 0.46089483255174657,\n\ \ \"acc_stderr\": 0.011702308149823175\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.32246224832214765,\n \"em_stderr\": 0.004786806140711669,\n\ \ \"f1\": 0.3963055788590608,\n \"f1_stderr\": 0.004634063813539812\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19408642911296436,\n \ \ \"acc_stderr\": 0.010893918308192417\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453932\n\ \ }\n}\n```" repo_url: https://huggingface.co/maywell/Synatra-V0.1-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|arc:challenge|25_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T04-30-58.971713.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T11_25_13.204412 path: - '**/details_harness|drop|3_2023-10-23T11-25-13.204412.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T11-25-13.204412.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T11_25_13.204412 path: - '**/details_harness|gsm8k|5_2023-10-23T11-25-13.204412.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T11-25-13.204412.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hellaswag|10_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-30-58.971713.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T04_30_58.971713 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T04-30-58.971713.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T04-30-58.971713.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T11_25_13.204412 path: - '**/details_harness|winogrande|5_2023-10-23T11-25-13.204412.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T11-25-13.204412.parquet' - config_name: results data_files: - split: 2023_10_10T04_30_58.971713 path: - results_2023-10-10T04-30-58.971713.parquet - split: 2023_10_23T11_25_13.204412 path: - results_2023-10-23T11-25-13.204412.parquet - split: latest path: - results_2023-10-23T11-25-13.204412.parquet --- # Dataset Card for Evaluation run of maywell/Synatra-V0.1-7B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/maywell/Synatra-V0.1-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [maywell/Synatra-V0.1-7B](https://huggingface.co/maywell/Synatra-V0.1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-V0.1-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T11:25:13.204412](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B/blob/main/results_2023-10-23T11-25-13.204412.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.32246224832214765, "em_stderr": 0.004786806140711669, "f1": 0.3963055788590608, "f1_stderr": 0.004634063813539812, "acc": 0.46089483255174657, "acc_stderr": 0.011702308149823175 }, "harness|drop|3": { "em": 0.32246224832214765, "em_stderr": 0.004786806140711669, "f1": 0.3963055788590608, "f1_stderr": 0.004634063813539812 }, "harness|gsm8k|5": { "acc": 0.19408642911296436, "acc_stderr": 0.010893918308192417 }, "harness|winogrande|5": { "acc": 0.7277032359905288, "acc_stderr": 0.012510697991453932 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,586
[ [ -0.02008056640625, -0.039703369140625, 0.01467132568359375, 0.006870269775390625, -0.0093841552734375, 0.0014638900756835938, -0.026519775390625, -0.01531219482421875, 0.0304718017578125, 0.033477783203125, -0.053466796875, -0.07171630859375, -0.050933837890625,...
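The aggregate "acc" in the "all" block of the Synatra card above appears to be the unweighted mean of the per-task accuracies (gsm8k and winogrande); a minimal sketch reproducing it, assuming that averaging scheme:

```python
# Per-task accuracies copied from the "Latest results" block of the card above.
per_task_acc = {
    "harness|gsm8k|5": 0.19408642911296436,
    "harness|winogrande|5": 0.7277032359905288,
}

# Assumption: the "all" accuracy is the unweighted mean over tasks reporting acc.
agg_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(agg_acc)  # matches the reported "all" acc of 0.46089483255174657
```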
chrisdosheavymetal/RENAN
2023-10-10T04:50:59.000Z
[ "region:us" ]
chrisdosheavymetal
null
null
0
0
2023-10-10T04:47:24
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
michaelginn/childes_phones
2023-10-10T22:57:03.000Z
[ "region:us" ]
michaelginn
null
null
0
0
2023-10-10T04:49:38
--- dataset_info: features: - name: line dtype: string - name: file dtype: string - name: ipa dtype: string splits: - name: train num_bytes: 1979606 num_examples: 28466 download_size: 932024 dataset_size: 1979606 --- # Dataset Card for "childes_phones" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
421
[ [ -0.035980224609375, 0.00803375244140625, -0.00391387939453125, 0.00839996337890625, -0.0163726806640625, -0.00241851806640625, 0.03509521484375, -0.0074005126953125, 0.050384521484375, 0.021759033203125, -0.069580078125, -0.04205322265625, -0.031768798828125, ...
dparksports/embedded_faqs_medicare
2023-10-10T05:05:42.000Z
[ "region:us" ]
dparksports
null
null
0
0
2023-10-10T04:59:05
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
songys/ChatbotData
2023-10-10T05:24:31.000Z
[ "license:cc-by-sa-4.0", "region:us" ]
songys
null
null
0
0
2023-10-10T05:22:01
--- license: cc-by-sa-4.0 --- # Chatbot_data. Chatbot_data_for_Korean v1.0 ## Data description. This is synthetic data. For some breakup-related questions, the answers were written with reference to stories that frequently appear in the Daum cafe "A heartbreak more beautiful than love" ( http://cafe116.daum.net/_c21_/home?grpid=1bld ). For example, for a question such as "It has been ten days (or 100 days) since my breakup", the chatbot's answer was written with the intent of offering comfort. 1. 11,876 question-answer pairs for chatbot training 2. Labeled as everyday topics 0, breakup (negative) 1, love (positive) 2 # Citation Youngsook Song.(2018). Chatbot_data_for_Korean v1.0)[Online]. Available : https://github.com/songys/Chatbot_data (downloaded 2022. June. 29.)
577
[ [ -0.00853729248046875, -0.0645751953125, -0.004116058349609375, 0.05810546875, -0.036956787109375, 0.011993408203125, -0.00592041015625, -0.011871337890625, 0.0478515625, 0.03826904296875, -0.053131103515625, -0.05548095703125, -0.0293426513671875, 0.00347328...
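The ChatbotData card above labels each pair as 0 (everyday), 1 (breakup, negative), or 2 (love, positive). A minimal label-decoding sketch; the English label names are illustrative, not part of the dataset files:

```python
# Label scheme described in the ChatbotData card: 0 = everyday, 1 = breakup
# (negative), 2 = love (positive). The English names here are illustrative.
LABEL_NAMES = {0: "everyday", 1: "breakup (negative)", 2: "love (positive)"}

def decode_label(label: int) -> str:
    """Map a numeric ChatbotData label to a readable name."""
    return LABEL_NAMES[label]

print(decode_label(2))  # love (positive)
```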
SRGui/simple_cn_food_demo
2023-10-10T05:46:24.000Z
[ "task_categories:image-classification", "region:us" ]
SRGui
null
null
0
0
2023-10-10T05:38:33
--- task_categories: - image-classification --- # AutoTrain Dataset for project: demo-resnet50-test ## Dataset Description This dataset has been automatically processed by AutoTrain for project demo-resnet50-test. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "image": "<600x600 RGB PIL image>", "target": 0 }, { "image": "<600x799 RGB PIL image>", "target": 1 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "image": "Image(decode=True, id=None)", "target": "ClassLabel(names=['000', '005', '033'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and a validation split. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 288 | | valid | 75 |
957
[ [ -0.047698974609375, 0.01529693603515625, -0.004192352294921875, 0.01403045654296875, -0.0216522216796875, 0.01404571533203125, -0.01143646240234375, -0.0222930908203125, 0.0024013519287109375, 0.0283050537109375, -0.052154541015625, -0.042144775390625, -0.025878...
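The split table in the AutoTrain card above can be sanity-checked in a couple of lines; a small sketch, assuming the two listed splits are exhaustive:

```python
# Split sizes copied from the card's split table.
splits = {"train": 288, "valid": 75}
total = sum(splits.values())             # 363 samples overall
train_fraction = splits["train"] / total
print(round(train_fraction, 2))          # ~0.79, i.e. roughly an 80/20 split
```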
songys/Ko_humane_right_copus
2023-10-10T05:47:59.000Z
[ "license:cc-by-sa-3.0", "region:us" ]
songys
null
null
1
0
2023-10-10T05:45:39
--- license: cc-by-sa-3.0 --- # HRC: Building a human rights corpus for interactive generation models ## Reference data - References decision examples and counseling cases from the [National Human Rights Commission of Korea](https://case.humanrights.go.kr/dici/diciList.do) - To change the writing style and convert the material into question-and-answer form, example passages were written with the surrounding context in mind, and question-answer pairs were generated with GPT-3.5-turbo after one-shot learning ## Data structure - Data structure: source_copus---counsel.jsonl ---decision.jsonl humane_right_copus_v1.jsonl ## Prompt examples ``` [상담례 prompt] 주어진 상담 문서를 자연스러운 질문, 답변 형태로 변형해 주세요. 답변이 끝나면 '#####'를 작성해 주세요. 반드시 원본의 답변 내용을 기반으로 답변해야 합니다. 질문은 최대한 간결하게 작성해 주세요. ##### 상담 내용: 시청 앞 광장에서 노동조합이 기자회견을 하고 있습니다. 그런데 경찰 세 개 중대가 기자회견장을 에워싸고 집회를 해산하라는 방송을 계속 내보내고 있습니다. 우리는 기자회견을 하고 있으며 설령 집회라 하더라도 평화적으로 하는 집회에 대해 해산 명령을 하는 것은 부당합니다. 원본 상담 답변: 신고를 하지 아니하였다는 이유만으로 개최가 허용되지 않는 집회 내지 시위라고 단정할 수 없습니다. 기자회견이 개인 발언 및 구호 제창 등 미신고 옥외집회의 성격을 띠었다고 하더라도 헌법상 표현의 자유와 집회 시위의 자유라는 민주주의의 근간을 이루는 국민의 기본권이 최대한 보장되도록 해야 합니다. 참고로 대법원(2012도9776)은 "기자회견이 집회에 해당한다 하더라도, 해산명령불응죄가 성립하려면 집회로 인해 타인의 법익이나 공공의 안녕질서에 대한 직접적인 위험이 명백하게 초래된 경우에 해당"하는 것으로 판단한 바 있습니다. 다만, 기자회견이라는 이름으로 개최되고, 일부 기자회견의 성격이 가미되었다고 하더라도 집회로서의 성격을 부인할 수 없다면 신고의 의무까지 면제된다고 보기는 어려울 수 있습니다. 간결한 질문: 시청 앞 광장에서 노동조합이 기자회견을 진행 중입니다. 그런데 경찰이 기자회견장을 둘러싸고 해산하라는 방송을 내보내고 있습니다. 이런 상황에서 경찰의 해산 명령은 정당한 것인가요? 답변: 신고를 하지 않은 집회가 허용되지 않는다는 단순한 이유만으로 기자회견을 미신고 옥외집회로 판단하는 것은 부적절합니다. 기자회견도 표현의 자유와 집회 시위의 자유, 민주주의의 핵심적인 국민의 기본권에 해당하므로 최대한 보장되어야 합니다. 대법원의 판례(2012도9776)에 따르면, "기자회견이 집회에 해당한다 해도, 그 집회로 인해 다른 사람의 법적 이익이나 공공 안녕질서에 대한 명백한 위험이 발생해야 해산명령불응죄가 성립한다"고 하였습니다. 그렇지만, 기자회견이 집회의 성격을 함께 갖는 경우 신고 의무를 면제받는 것은 아닙니다. ##### 상담 내용: {} 원본 상담 답변: {} 간결한 질문: [결정례 prompt] 주어진 요청, 근거, 결론을 참고하여, 새로운 법률 관련 질문과 답변을 만들어주세요. 답변은 500자 이내로 작성해주세요. 반드시 주어진 자료의 사실을 활용해야 합니다. 답변 끝나면, '*****' 를 작성해주세요. ***** 요청: 진정인은 사회복무요원으로 피진정기관에서 업무 보조 및 폐의약품 수거 일을 하였다. 2021. 7. 26.폐의약품 수거를 위해 진정인의 자전거를 타고 인도 위를 지나다 뛰어오던 행인을 치는 교통사고를 내어 벌금형40만을 선고받았다.교통사고 피해자는 전치 6주 진단을 받았고, 1,30만원가량의 국가배상 청구를 신청한 상태이며, 피진정인은 진정인에게중과실 책임이 있기 때문에 구상권을 청구할 수 있다고 한다.진정인이 인도에서 자전거를 운행한 것은 잘못이지만, 공무 중 발생한 사고에 대해 사회복무요원이 모든 책임을 지는 것은 부당하다. 
피진정기관이 진정인에게 구상권을 행사하지 않도록 도와주기 바란다. 근거: 1. 진정인이 자전거로 인도를 횡단하는 등 중과실 책임이 있긴 하나, 사전에 피진정인이 복무관리기관의 장으로서 주의 의무를 다하였다면 진정인이 교통사고에 이르지 않았을 수 있고, 20대 초반의 사회초년생이 벌금을 비롯해 고액의 손해배상 부담을 지는 상황을 피할 수 있었을 것임. 2. 사회복무요원은 병역의무를 이행하고자 국가기관, 지방자치단체 등에 소속되어 공익목적 업무 수행 차원에서 사회서비스 및 행정업무에 복무하고 있는 자이고, 「병역법」 제31조에서도 사회복무요원의 직무상 행위를 공무수행으로 규정하고 있는 바, 공무수행 중 발생한 사고에 대하여 사회복무요원을 공무원 등과 달리 대우할 합리적인 이유가 없음. 따라서 법적 지원체계 마련이 필요함. 3. 진정 취지가 구상권을 행사하지 않도록 해달라는 요청이므로, 이는 헌법상 권리 침해 여부를 판단할 사안이 아님. 결론: 주문 1 : 1. 광명시장에게, 안전사고 위험이 큰 분야에 소속 사회복무요원을 가급적 배치하지 않도록 하고, 배치가 불가피할 때에는 사전에 안전교육을 철저히 실시하도록 의견을 표명합니다. 주문 2 : 2. 병무청장에게, 사회복무요원의 복무 관련 사고 발생 시 초기단계부터 지원하고 민형사상 부담을 최소화하는 지원체계를 마련하도록 의견을 표명합니다. 주문 3 : 3. 이 사건 진정은 각하합니다. 위를 참고한 새로운 질문과 답변 질문: 사회복무요원이 공무 중 발생한 교통사고로 인한 피해자에 대한 구상권을 행사할 수 있는지에 대한 법적 근거가 어떻게 되는지 알려주세요. 답변: 사회복무요원이 공무 중 발생한 교통사고로 인한 피해자에 대한 구상권 여부는 법적으로 인정됩니다. 이는 병역법 제31조에 따라 사회복무요원의 직무를 공무수행으로 규정하고 있으며, 중과실 책임이 있더라도 사전에 안전 조치를 취하거나 교통법규를 준수해 사고를 예방할 의무가 있음을 의미합니다. 따라서 피해자는 피사회복무요원에 대한 손해배상을 청구할 수 있습니다. ***** 요청: {} 근거: {} 결론: {} 위를 참고한 새로운 질문과 답변 질문: ``` ## 인용 ``` @inproceedings{song2023}, author = {송영숙 and 심상진 and 김성현}, title = {대화형 생성 모델을 위한 인권 코퍼스 구축}, booktitle = {한글 및 한국어 정보처리 학술대회 발표 예정)}, year = {2023}, publisher = {한글 및 한국어 정보처리 학회} } ```
3,422
[ [ -0.045623779296875, -0.0362548828125, 0.0180816650390625, 0.0246429443359375, -0.0302276611328125, 0.0108489990234375, 0.0285797119140625, -0.0189208984375, 0.0430908203125, 0.031951904296875, -0.02001953125, -0.03582763671875, -0.038238525390625, 0.01111602...
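The human rights corpus card above describes the data as JSON Lines files (counsel.jsonl, decision.jsonl, humane_right_copus_v1.jsonl). A minimal sketch of a JSONL reader; the file name in the usage note follows the layout named in the card, and the record shape is an assumption, not taken from the repository:

```python
import json

def load_jsonl(path):
    """Read one JSON object per line, as in the corpus files described above."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

# Hypothetical usage (file name taken from the directory layout in the card):
# records = load_jsonl("humane_right_copus_v1.jsonl")
```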
SRGui/autotrain-data-tete
2023-10-10T06:00:31.000Z
[ "region:us" ]
SRGui
null
null
0
0
2023-10-10T06:00:31
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
casey-martin/qald_9_plus
2023-10-10T06:06:19.000Z
[ "task_categories:table-question-answering", "task_categories:text2text-generation", "language:ba", "language:be", "language:de", "language:en", "language:fr", "language:hy", "language:lt", "language:ru", "language:uk", "license:cc-by-4.0", "semantic web", "sparql", "wikidata", "dbpedia...
casey-martin
null
null
0
0
2023-10-10T06:00:44
--- license: cc-by-4.0 task_categories: - table-question-answering - text2text-generation language: - ba - be - de - en - fr - hy - lt - ru - uk tags: - semantic web - sparql - wikidata - dbpedia pretty_name: QALD 9+ --- # QALD-9-plus Dataset Description QALD-9-plus is the dataset for Knowledge Graph Question Answering (KGQA) based on the well-known [QALD-9](https://github.com/ag-sc/QALD/tree/master/9/data). QALD-9-plus makes it possible to train and test KGQA systems over DBpedia and Wikidata using questions in 9 different languages: English, German, Russian, French, Armenian, Belarusian, Lithuanian, Bashkir, and Ukrainian. Some of the questions have several alternative writings in particular languages, which makes it possible to evaluate the robustness of KGQA systems and train paraphrasing models. As the questions' translations were provided by native speakers, they are considered a "gold standard"; therefore, machine translation tools can be trained and evaluated on the dataset. # Dataset Statistics | | en | de | fr | ru | uk | lt | be | ba | hy | # questions DBpedia | # questions Wikidata | |-------|:---:|:---:|:--:|:----:|:---:|:---:|:---:|:---:|:--:|:-----------:|:-----------:| | Train | 408 | 543 | 260 | 1203 | 447 | 468 | 441 | 284 | 80 | 408 | 371 | | Test | 150 | 176 | 26 | 348 | 176 | 186 | 155 | 117 | 20 | 150 | 136 | Given the numbers, it is obvious that some of the languages are covered more than once, i.e., there is more than one translation for a particular question. For example, there are 1203 Russian translations available while only 408 unique questions exist in the training subset (i.e., 2.9 Russian translations per question). The availability of such parallel corpora enables researchers, developers, and other dataset users to address the paraphrasing task. # Evaluation We used the [GERBIL](https://github.com/dice-group/gerbil/) system to evaluate the dataset. 
The detailed information for the experiments is available at the individual link (click the value in the cells). ## Wikidata ### QAnswer | | en | de | ru | fr | |-----|----|----|----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110010001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180000)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180002)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110010007)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180006)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180007)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180008)| ### DeepPavlov | | en | ru | |-----|----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110080010)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180003)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110090001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180009)| ### Platypus | | en | fr | |-----|----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110110004)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180004)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110110006)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112180010)| ## DBpedia ### QAnswer | | en | de | ru | fr | |-----|----|----|----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110120004)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190000)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190001)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190002)| 
|Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202110130002)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190003)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190004)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190005)| ## Wikidata Original Translations ### QAnswer | | de | ru | fr | |-----|----|----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190006)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190007)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190008)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190009)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190010)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190011)| ### DeepPavlov | | ru | |-----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190012)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190014)| ### Platypus | | fr | |-----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190013)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190015)| ## DBpedia Original Translations ### QAnswer | | de | ru | fr | |-----|----|----|----| |Test |[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190016)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190017)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190018)| |Train|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190019)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190020)|[link](http://gerbil-qa.aksw.org/gerbil/experiment?id=202112190021)| # Cite ```bibtex @inproceedings{perevalov2022qald9plus, author={Perevalov, Aleksandr and Diefenbach, Dennis and Usbeck, Ricardo and Both, Andreas}, booktitle={2022 IEEE 16th International Conference on Semantic Computing (ICSC)}, title={QALD-9-plus: A Multilingual Dataset for Question Answering over DBpedia and 
Wikidata Translated by Native Speakers}, year={2022}, pages={229-234}, doi={10.1109/ICSC52841.2022.00045} } ``` # Useful Links * ArXiv [link](https://arxiv.org/abs/2202.00120) * Papers with Code: [Paper](https://paperswithcode.com/paper/qald-9-plus-a-multilingual-dataset-for-1), [Dataset](https://paperswithcode.com/dataset/qald-9-plus) * Video presentation on YouTube: https://youtu.be/W1w7CJTV48c * Presentation [slides](https://drive.google.com/file/d/1cDphq4DeSiZr-WBvdwu34rcxQ0aP4q95/view?usp=sharing) * Google Colab [notebook](https://colab.research.google.com/drive/1eWsQoIaeT9_vii1v3PVU04Rms4EoyLAh?usp=sharing) # Licence [![CC BY 4.0][cc-by-shield]][cc-by] This work is licensed under a [Creative Commons Attribution 4.0 International License][cc-by]. [![CC BY 4.0][cc-by-image]][cc-by] [cc-by]: http://creativecommons.org/licenses/by/4.0/ [cc-by-image]: https://i.creativecommons.org/l/by/4.0/88x31.png [cc-by-shield]: https://img.shields.io/badge/License-CC%20BY%204.0-lightgrey.svg # Dataset Metadata The following table is necessary for this dataset to be indexed by search engines such as <a href="https://g.co/datasetsearch">Google Dataset Search</a>. 
<div itemscope itemtype="http://schema.org/Dataset"> <table> <tr> <th>property</th> <th>value</th> </tr> <tr> <td>name</td> <td><code itemprop="name">QALD-9-plus: A Multilingual Dataset for Question Answering over DBpedia and Wikidata Translated by Native Speakers</code></td> </tr> <tr> <td>alternateName</td> <td><code itemprop="alternateName">QALD-9-plus</code></td> </tr> <tr> <td>url</td> <td><code itemprop="url">https://github.com/Perevalov/qald_9_plus/tree/main/data</code></td> </tr> <tr> <td>description</td> <td><code itemprop="description">QALD-9-Plus is the dataset for Knowledge Graph Question Answering (KGQA) based on the well-known QALD-9.<br/> QALD-9-Plus makes it possible to train and test KGQA systems over DBpedia and Wikidata using questions in 9 different languages: English, German, Russian, French, Armenian, Belarusian, Lithuanian, Bashkir, and Ukrainian.<br/> Some of the questions have several alternative writings in particular languages, which makes it possible to evaluate the robustness of KGQA systems and train paraphrasing models.<br/> As the questions' translations were provided by native speakers, they are considered a "gold standard"; therefore, machine translation tools can be trained and evaluated on the dataset.</code></td> </tr> <tr> <td>license</td> <td> <div itemscope itemtype="http://schema.org/CreativeWork" itemprop="license"> <table> <tr> <th>property</th> <th>value</th> </tr> <tr> <td>name</td> <td><code itemprop="name">CC-BY-4.0</code></td> </tr> <tr> <td>url</td> <td><code itemprop="url">https://creativecommons.org/licenses/by/4.0/</code></td> </tr> </table> </div> </td> </tr> <tr> <td>citation</td> <td><code itemprop="citation">Perevalov, Aleksandr, Diefenbach, Dennis, Usbeck, Ricardo, Both, Andreas: QALD-9-plus: A multilingual dataset for question answering over DBpedia and Wikidata translated by native speakers. In: 2022 IEEE 16th International Conference on Semantic Computing (ICSC). IEEE (2022)</code></td> </tr> </table> </div>
9,220
[ [ -0.0618896484375, -0.0311279296875, 0.0181427001953125, 0.004917144775390625, -0.0178375244140625, -0.0110321044921875, -0.0088958740234375, -0.03521728515625, 0.0267791748046875, 0.00684356689453125, -0.0413818359375, -0.040802001953125, -0.022735595703125, ...
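The statistics table in the QALD-9-plus card above implies the per-language translation ratios the card mentions (e.g. ~2.9 Russian translations per question); a small sketch recomputing them from the train row:

```python
# Train-split translation counts per language, copied from the statistics table.
train_counts = {"en": 408, "de": 543, "fr": 260, "ru": 1203, "uk": 447,
                "lt": 468, "be": 441, "ba": 284, "hy": 80}
unique_train_questions = 408  # unique questions in the training subset

ratios = {lang: n / unique_train_questions for lang, n in train_counts.items()}
print(round(ratios["ru"], 1))  # 2.9, as stated in the card
```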
open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2
2023-10-10T06:05:44.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T06:04:45
--- pretty_name: Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B-v2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [krevas/LDCC-Instruct-Llama-2-ko-13B-v2](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T06:04:26.663902](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2/blob/main/results_2023-10-10T06-04-26.663902.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45958883488115343,\n\ \ \"acc_stderr\": 0.034511714778603424,\n \"acc_norm\": 0.4636864222606454,\n\ \ \"acc_norm_stderr\": 0.03449288105358144,\n \"mc1\": 0.2668298653610771,\n\ \ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.39776112473254976,\n\ \ \"mc2_stderr\": 0.013677730634490858\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\ \ \"acc_norm\": 0.5639931740614335,\n \"acc_norm_stderr\": 0.014491225699230916\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6105357498506274,\n\ \ \"acc_stderr\": 0.004866322258335963,\n \"acc_norm\": 0.8181637124078869,\n\ \ \"acc_norm_stderr\": 0.0038492126228151717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\ \ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\ \ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483205,\n\ \ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483205\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\ \ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389188,\n\ \ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389188\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\ \ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\ \ \"acc_norm_stderr\": 
0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\ \ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\ \ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n\ \ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\ \ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596239,\n\ \ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596239\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\ \ \"acc_stderr\": 0.04142439719489359,\n \"acc_norm\": 0.2631578947368421,\n\ \ \"acc_norm_stderr\": 0.04142439719489359\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\ \ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\ acc_norm\": 
0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\ \ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\ \ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5096774193548387,\n \"acc_stderr\": 0.02843867799890955,\n \"\ acc_norm\": 0.5096774193548387,\n \"acc_norm_stderr\": 0.02843867799890955\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\ acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\ : 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\ \ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\ acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.033403619062765864,\n\ \ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.033403619062765864\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n\ \ \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514565,\n \ \ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514565\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n\ \ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\ acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.618348623853211,\n \"acc_stderr\": 0.02082814851702258,\n \"acc_norm\"\ : 0.618348623853211,\n \"acc_norm_stderr\": 0.02082814851702258\n },\n\ \ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2175925925925926,\n\ \ \"acc_stderr\": 0.028139689444859672,\n \"acc_norm\": 0.2175925925925926,\n\ \ \"acc_norm_stderr\": 0.028139689444859672\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247273,\n\ \ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247273\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \ \ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\ \ \"acc_stderr\": 0.033408675019233246,\n \"acc_norm\": 0.547085201793722,\n\ \ \"acc_norm_stderr\": 0.033408675019233246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\ \ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\ acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04766075165356462,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 0.04766075165356462\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.038258255488486076,\n\ \ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.038258255488486076\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\ \ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\ \ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n\ \ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\ \ \"acc_stderr\": 0.028120966503914425,\n \"acc_norm\": 0.7564102564102564,\n\ \ \"acc_norm_stderr\": 0.028120966503914425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\ \ \"acc_stderr\": 0.017166362471369295,\n \"acc_norm\": 0.6398467432950191,\n\ \ \"acc_norm_stderr\": 0.017166362471369295\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637792,\n\ \ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637792\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\ \ \"acc_stderr\": 0.014614465821966337,\n \"acc_norm\": 
0.2569832402234637,\n\ \ \"acc_norm_stderr\": 0.014614465821966337\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576063,\n\ \ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576063\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\ \ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n\ \ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668777,\n\ \ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668777\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963768,\n \ \ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963768\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37157757496740546,\n\ \ \"acc_stderr\": 0.012341828514528285,\n \"acc_norm\": 0.37157757496740546,\n\ \ \"acc_norm_stderr\": 0.012341828514528285\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\ \ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.48366013071895425,\n \"acc_stderr\": 0.02021703065318646,\n \ \ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.02021703065318646\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\ \ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\ \ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.34285714285714286,\n \"acc_stderr\": 0.03038726291954773,\n\ \ \"acc_norm\": 0.34285714285714286,\n \"acc_norm_stderr\": 0.03038726291954773\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\ \ \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n\ \ \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\ \ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\ \ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\ \ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\ \ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.39776112473254976,\n\ \ \"mc2_stderr\": 0.013677730634490858\n }\n}\n```" repo_url: https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|arc:challenge|25_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hellaswag|10_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-04-26.663902.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T06_04_26.663902 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T06-04-26.663902.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T06-04-26.663902.parquet' - config_name: results data_files: - split: 2023_10_10T06_04_26.663902 path: - results_2023-10-10T06-04-26.663902.parquet - split: latest path: - results_2023-10-10T06-04-26.663902.parquet --- # Dataset Card for Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B-v2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[krevas/LDCC-Instruct-Llama-2-ko-13B-v2](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T06:04:26.663902](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B-v2/blob/main/results_2023-10-10T06-04-26.663902.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.45958883488115343, "acc_stderr": 0.034511714778603424, "acc_norm": 0.4636864222606454, "acc_norm_stderr": 0.03449288105358144, "mc1": 0.2668298653610771, "mc1_stderr": 0.015483691939237265, "mc2": 0.39776112473254976, "mc2_stderr": 0.013677730634490858 }, "harness|arc:challenge|25": { "acc": 0.5298634812286689, "acc_stderr": 0.014585305840007105, "acc_norm": 0.5639931740614335, "acc_norm_stderr": 0.014491225699230916 }, "harness|hellaswag|10": { "acc": 0.6105357498506274, "acc_stderr": 0.004866322258335963, "acc_norm": 0.8181637124078869, "acc_norm_stderr": 0.0038492126228151717 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4342105263157895, "acc_stderr": 0.040335656678483205, "acc_norm": 0.4342105263157895, "acc_norm_stderr": 0.040335656678483205 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.46037735849056605, "acc_stderr": 0.030676096599389188, "acc_norm": 0.46037735849056605, "acc_norm_stderr": 0.030676096599389188 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5486111111111112, "acc_stderr": 0.041614023984032786, "acc_norm": 0.5486111111111112, "acc_norm_stderr": 0.041614023984032786 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, 
"acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3352601156069364, "acc_stderr": 0.03599586301247077, "acc_norm": 0.3352601156069364, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.17647058823529413, "acc_stderr": 0.03793281185307809, "acc_norm": 0.17647058823529413, "acc_norm_stderr": 0.03793281185307809 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3617021276595745, "acc_stderr": 0.03141082197596239, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.03141082197596239 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.04142439719489359, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.04142439719489359 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3103448275862069, "acc_stderr": 0.03855289616378948, "acc_norm": 0.3103448275862069, "acc_norm_stderr": 0.03855289616378948 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.0220190800122179, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.0220190800122179 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23015873015873015, "acc_stderr": 0.03764950879790606, "acc_norm": 0.23015873015873015, "acc_norm_stderr": 0.03764950879790606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5096774193548387, "acc_stderr": 0.02843867799890955, "acc_norm": 0.5096774193548387, "acc_norm_stderr": 0.02843867799890955 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3399014778325123, "acc_stderr": 0.033327690684107895, "acc_norm": 0.3399014778325123, "acc_norm_stderr": 0.033327690684107895 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5818181818181818, "acc_stderr": 0.03851716319398395, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.03851716319398395 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5404040404040404, "acc_stderr": 0.035507024651313425, "acc_norm": 0.5404040404040404, "acc_norm_stderr": 0.035507024651313425 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.689119170984456, "acc_stderr": 0.033403619062765864, "acc_norm": 0.689119170984456, "acc_norm_stderr": 0.033403619062765864 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.40512820512820513, "acc_stderr": 0.024890471769938145, "acc_norm": 0.40512820512820513, "acc_norm_stderr": 0.024890471769938145 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.02708037281514565, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.02708037281514565 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.40756302521008403, "acc_stderr": 0.03191863374478465, "acc_norm": 0.40756302521008403, "acc_norm_stderr": 0.03191863374478465 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.25165562913907286, "acc_stderr": 0.035433042343899844, "acc_norm": 0.25165562913907286, "acc_norm_stderr": 0.035433042343899844 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.618348623853211, "acc_stderr": 0.02082814851702258, "acc_norm": 0.618348623853211, "acc_norm_stderr": 0.02082814851702258 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2175925925925926, "acc_stderr": 
0.028139689444859672, "acc_norm": 0.2175925925925926, "acc_norm_stderr": 0.028139689444859672 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6078431372549019, "acc_stderr": 0.03426712349247273, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.03426712349247273 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6244725738396625, "acc_stderr": 0.03152256243091156, "acc_norm": 0.6244725738396625, "acc_norm_stderr": 0.03152256243091156 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.547085201793722, "acc_stderr": 0.033408675019233246, "acc_norm": 0.547085201793722, "acc_norm_stderr": 0.033408675019233246 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5419847328244275, "acc_stderr": 0.04369802690578756, "acc_norm": 0.5419847328244275, "acc_norm_stderr": 0.04369802690578756 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.042059539338841226, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.042059539338841226 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356462, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356462 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6134969325153374, "acc_stderr": 0.038258255488486076, "acc_norm": 0.6134969325153374, "acc_norm_stderr": 0.038258255488486076 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.6116504854368932, "acc_stderr": 0.04825729337356389, "acc_norm": 0.6116504854368932, "acc_norm_stderr": 0.04825729337356389 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7564102564102564, "acc_stderr": 0.028120966503914425, "acc_norm": 0.7564102564102564, "acc_norm_stderr": 0.028120966503914425 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.56, "acc_stderr": 
0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6398467432950191, "acc_stderr": 0.017166362471369295, "acc_norm": 0.6398467432950191, "acc_norm_stderr": 0.017166362471369295 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5115606936416185, "acc_stderr": 0.02691189868637792, "acc_norm": 0.5115606936416185, "acc_norm_stderr": 0.02691189868637792 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2569832402234637, "acc_stderr": 0.014614465821966337, "acc_norm": 0.2569832402234637, "acc_norm_stderr": 0.014614465821966337 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4803921568627451, "acc_stderr": 0.028607893699576063, "acc_norm": 0.4803921568627451, "acc_norm_stderr": 0.028607893699576063 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5691318327974276, "acc_stderr": 0.028125340983972714, "acc_norm": 0.5691318327974276, "acc_norm_stderr": 0.028125340983972714 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.027628737155668777, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.027628737155668777 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.35106382978723405, "acc_stderr": 0.028473501272963768, "acc_norm": 0.35106382978723405, "acc_norm_stderr": 0.028473501272963768 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.37157757496740546, "acc_stderr": 0.012341828514528285, "acc_norm": 0.37157757496740546, "acc_norm_stderr": 0.012341828514528285 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3713235294117647, "acc_stderr": 0.02934980313976587, "acc_norm": 0.3713235294117647, "acc_norm_stderr": 0.02934980313976587 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.48366013071895425, "acc_stderr": 0.02021703065318646, "acc_norm": 0.48366013071895425, "acc_norm_stderr": 0.02021703065318646 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5727272727272728, "acc_stderr": 
0.04738198703545483, "acc_norm": 0.5727272727272728, "acc_norm_stderr": 0.04738198703545483 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.34285714285714286, "acc_stderr": 0.03038726291954773, "acc_norm": 0.34285714285714286, "acc_norm_stderr": 0.03038726291954773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6368159203980099, "acc_stderr": 0.03400598505599014, "acc_norm": 0.6368159203980099, "acc_norm_stderr": 0.03400598505599014 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.03424042924691584, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.03424042924691584 }, "harness|truthfulqa:mc|0": { "mc1": 0.2668298653610771, "mc1_stderr": 0.015483691939237265, "mc2": 0.39776112473254976, "mc2_stderr": 0.013677730634490858 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
65,129
[ [ -0.048309326171875, -0.058197021484375, 0.0201873779296875, 0.015625, -0.013641357421875, 0.000278472900390625, 0.0031452178955078125, -0.0180816650390625, 0.039154052734375, -0.0027866363525390625, -0.034271240234375, -0.04718017578125, -0.033660888671875, ...
kerk86/fast-stable-diffusion
2023-10-18T14:49:14.000Z
[ "region:us" ]
kerk86
null
null
0
0
2023-10-10T06:32:35
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1
2023-10-24T09:44:01.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T06:39:11
--- pretty_name: Evaluation run of mistralai/Mistral-7B-Instruct-v0.1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T09:43:48.997990](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1/blob/main/results_2023-10-24T09-43-48.997990.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37038590604026844,\n\ \ \"em_stderr\": 0.00494543044549648,\n \"f1\": 0.43100566275167973,\n\ \ \"f1_stderr\": 0.00478990485809286,\n \"acc\": 0.4398533245809979,\n\ \ \"acc_stderr\": 0.01100025548646791\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.37038590604026844,\n \"em_stderr\": 0.00494543044549648,\n\ \ \"f1\": 0.43100566275167973,\n \"f1_stderr\": 0.00478990485809286\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \ \ \"acc_stderr\": 0.009629588445673814\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262006\n\ \ }\n}\n```" repo_url: https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|arc:challenge|25_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T06-38-48.353025.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T09_43_48.997990 path: - '**/details_harness|drop|3_2023-10-24T09-43-48.997990.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T09-43-48.997990.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T09_43_48.997990 path: - '**/details_harness|gsm8k|5_2023-10-24T09-43-48.997990.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T09-43-48.997990.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hellaswag|10_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T06_38_48.353025 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T06_38_48.353025 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T06-38-48.353025.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T06-38-48.353025.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T09_43_48.997990 path: - '**/details_harness|winogrande|5_2023-10-24T09-43-48.997990.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T09-43-48.997990.parquet' - config_name: results data_files: - split: 2023_10_10T06_38_48.353025 path: - results_2023-10-10T06-38-48.353025.parquet - split: 2023_10_24T09_43_48.997990 path: - results_2023-10-24T09-43-48.997990.parquet - split: latest path: - results_2023-10-24T09-43-48.997990.parquet --- # Dataset Card for Evaluation run of mistralai/Mistral-7B-Instruct-v0.1 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T09:43:48.997990](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1/blob/main/results_2023-10-24T09-43-48.997990.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the "results" and the "latest" split for each eval): ```python { "all": { "em": 0.37038590604026844, "em_stderr": 0.00494543044549648, "f1": 0.43100566275167973, "f1_stderr": 0.00478990485809286, "acc": 0.4398533245809979, "acc_stderr": 0.01100025548646791 }, "harness|drop|3": { "em": 0.37038590604026844, "em_stderr": 0.00494543044549648, "f1": 0.43100566275167973, "f1_stderr": 0.00478990485809286 }, "harness|gsm8k|5": { "acc": 0.1425322213798332, "acc_stderr": 0.009629588445673814 }, "harness|winogrande|5": { "acc": 0.7371744277821626, "acc_stderr": 0.012370922527262006 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,708
[ [ -0.0298004150390625, -0.045074462890625, 0.01120758056640625, 0.0193328857421875, -0.01403045654296875, -0.0027637481689453125, -0.023773193359375, -0.01275634765625, 0.0220489501953125, 0.03955078125, -0.050384521484375, -0.0650634765625, -0.048370361328125, ...
open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct
2023-10-10T06:58:26.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T06:57:27
--- pretty_name: Evaluation run of maywell/Synatra-V0.1-7B-Instruct dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [maywell/Synatra-V0.1-7B-Instruct](https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T06:57:04.221099](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct/blob/main/results_2023-10-10T06-57-04.221099.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5526128967911348,\n\ \ \"acc_stderr\": 0.03462386322845972,\n \"acc_norm\": 0.5565308458707837,\n\ \ \"acc_norm_stderr\": 0.0346104765546302,\n \"mc1\": 0.390452876376989,\n\ \ \"mc1_stderr\": 0.017078230743431445,\n \"mc2\": 0.557562665558094,\n\ \ \"mc2_stderr\": 0.015250255723495946\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490975,\n\ \ \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526848\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5701055566620196,\n\ \ \"acc_stderr\": 0.004940490508240653,\n \"acc_norm\": 0.7662816172077276,\n\ \ \"acc_norm_stderr\": 0.004223302177263008\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\ \ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\ \ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\ \ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\ \ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\ \ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\ \ \"acc_norm_stderr\": 0.040629907841466674\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\ \ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\ \ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n\ \ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\ \ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\ \ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\ \ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\ acc_norm\": 0.37566137566137564,\n 
\"acc_norm_stderr\": 0.02494236893115979\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\ \ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\ \ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\ \ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\ \ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n\ \ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\ \ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\ acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296736,\n\ \ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296736\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\ \ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\ \ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\ acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\"\ : 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n\ \ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\ \ 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \"\ acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\ \ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\ \ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\ \ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n 
\"acc\":\ \ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\ acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\ \ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\ \ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\ \ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\ \ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\ \ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7369093231162197,\n\ \ \"acc_stderr\": 0.01574549716904904,\n \"acc_norm\": 0.7369093231162197,\n\ \ \"acc_norm_stderr\": 0.01574549716904904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\ \ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\ \ \"acc_stderr\": 0.014655780837497717,\n \"acc_norm\": 0.25921787709497207,\n\ \ \"acc_norm_stderr\": 
0.014655780837497717\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\ \ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\ \ \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n\ \ \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\ \ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614098,\n \ \ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614098\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4015645371577575,\n\ \ \"acc_stderr\": 0.012520315120147103,\n \"acc_norm\": 0.4015645371577575,\n\ \ \"acc_norm_stderr\": 0.012520315120147103\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\ \ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \ \ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\ \ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.6915422885572139,\n\ \ \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.6915422885572139,\n\ \ \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\ \ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\ \ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\ \ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\ \ \"mc1_stderr\": 0.017078230743431445,\n \"mc2\": 0.557562665558094,\n\ \ \"mc2_stderr\": 0.015250255723495946\n }\n}\n```" repo_url: https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|arc:challenge|25_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hellaswag|10_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-57-04.221099.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T06_57_04.221099 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T06-57-04.221099.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T06-57-04.221099.parquet' - config_name: results data_files: - split: 2023_10_10T06_57_04.221099 path: - results_2023-10-10T06-57-04.221099.parquet - split: latest path: - results_2023-10-10T06-57-04.221099.parquet --- # Dataset Card for Evaluation run of maywell/Synatra-V0.1-7B-Instruct ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[maywell/Synatra-V0.1-7B-Instruct](https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct", "harness_truthfulqa_mc_0", split="latest") ``` ## Latest results These are the [latest results from run 2023-10-10T06:57:04.221099](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct/blob/main/results_2023-10-10T06-57-04.221099.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5526128967911348, "acc_stderr": 0.03462386322845972, "acc_norm": 0.5565308458707837, "acc_norm_stderr": 0.0346104765546302, "mc1": 0.390452876376989, "mc1_stderr": 0.017078230743431445, "mc2": 0.557562665558094, "mc2_stderr": 0.015250255723495946 }, "harness|arc:challenge|25": { "acc": 0.5179180887372014, "acc_stderr": 0.014602005585490975, "acc_norm": 0.552901023890785, "acc_norm_stderr": 0.014529380160526848 }, "harness|hellaswag|10": { "acc": 0.5701055566620196, "acc_stderr": 0.004940490508240653, "acc_norm": 0.7662816172077276, "acc_norm_stderr": 0.004223302177263008 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5460526315789473, "acc_stderr": 0.04051646342874143, "acc_norm": 0.5460526315789473, "acc_norm_stderr": 0.04051646342874143 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283647, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283647 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929776, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929776 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46382978723404256, "acc_stderr": 0.03260038511835771, "acc_norm": 0.46382978723404256, "acc_norm_stderr": 0.03260038511835771 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.02494236893115979, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.02494236893115979 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.02743086657997347, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.02743086657997347 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4039408866995074, "acc_stderr": 0.034524539038220406, "acc_norm": 0.4039408866995074, "acc_norm_stderr": 0.034524539038220406 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.703030303030303, "acc_stderr": 0.035679697722680495, "acc_norm": 0.703030303030303, "acc_norm_stderr": 0.035679697722680495 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.032424979581788166, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.032424979581788166 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.028979089794296736, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.028979089794296736 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5358974358974359, "acc_stderr": 0.025285585990017848, "acc_norm": 0.5358974358974359, "acc_norm_stderr": 0.025285585990017848 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.02831753349606648, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.02831753349606648 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5378151260504201, "acc_stderr": 0.032385469487589795, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7577981651376147, "acc_stderr": 0.01836817630659862, "acc_norm": 0.7577981651376147, "acc_norm_stderr": 0.01836817630659862 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 
0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.03228210387037892, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.03228210387037892 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7088607594936709, "acc_stderr": 0.029571601065753374, "acc_norm": 0.7088607594936709, "acc_norm_stderr": 0.029571601065753374 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969638, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969638 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6528925619834711, "acc_stderr": 0.043457245702925335, "acc_norm": 0.6528925619834711, "acc_norm_stderr": 0.043457245702925335 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497752, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497752 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6503067484662577, "acc_stderr": 0.03746668325470021, "acc_norm": 0.6503067484662577, "acc_norm_stderr": 0.03746668325470021 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.62, "acc_stderr": 
0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7369093231162197, "acc_stderr": 0.01574549716904904, "acc_norm": 0.7369093231162197, "acc_norm_stderr": 0.01574549716904904 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6040462427745664, "acc_stderr": 0.02632981334194624, "acc_norm": 0.6040462427745664, "acc_norm_stderr": 0.02632981334194624 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25921787709497207, "acc_stderr": 0.014655780837497717, "acc_norm": 0.25921787709497207, "acc_norm_stderr": 0.014655780837497717 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5947712418300654, "acc_stderr": 0.028110928492809075, "acc_norm": 0.5947712418300654, "acc_norm_stderr": 0.028110928492809075 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6205787781350482, "acc_stderr": 0.02755994980234782, "acc_norm": 0.6205787781350482, "acc_norm_stderr": 0.02755994980234782 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6203703703703703, "acc_stderr": 0.027002521034516468, "acc_norm": 0.6203703703703703, "acc_norm_stderr": 0.027002521034516468 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.37943262411347517, "acc_stderr": 0.028947338851614098, "acc_norm": 0.37943262411347517, "acc_norm_stderr": 0.028947338851614098 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4015645371577575, "acc_stderr": 0.012520315120147103, "acc_norm": 0.4015645371577575, "acc_norm_stderr": 0.012520315120147103 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5220588235294118, "acc_stderr": 0.030343264224213514, "acc_norm": 0.5220588235294118, "acc_norm_stderr": 0.030343264224213514 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5473856209150327, "acc_stderr": 0.020136790918492523, "acc_norm": 0.5473856209150327, "acc_norm_stderr": 0.020136790918492523 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, 
"acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.030472526026726496, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.030472526026726496 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6915422885572139, "acc_stderr": 0.032658195885126966, "acc_norm": 0.6915422885572139, "acc_norm_stderr": 0.032658195885126966 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.390452876376989, "mc1_stderr": 0.017078230743431445, "mc2": 0.557562665558094, "mc2_stderr": 0.015250255723495946 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
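The per-task configs above name each split after the run timestamp, while the parquet paths use a slightly different stamp. A minimal sketch of the mapping (the `run_splits` helper is hypothetical, not part of the dataset or the `datasets` library):

```python
def run_splits(timestamp: str) -> tuple[str, str]:
    """Map a run timestamp like '2023-10-10T06:57:04.221099' to the
    split name and the filename stamp used in the parquet paths above."""
    # Split names replace '-' and ':' with '_'
    # (e.g. 2023_10_10T06_57_04.221099)
    split = timestamp.replace("-", "_").replace(":", "_")
    # Filename stamps keep the date dashes but turn ':' into '-'
    # (e.g. results_2023-10-10T06-57-04.221099.parquet)
    stamp = timestamp.replace(":", "-")
    return split, stamp

print(run_splits("2023-10-10T06:57:04.221099"))
```

Passing the resulting split name (or simply "latest") as `split=` to `load_dataset` selects that run's results.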
65,008
[ [ -0.04644775390625, -0.056182861328125, 0.019561767578125, 0.010498046875, -0.01055145263671875, -0.00661468505859375, 0.0009236335754394531, -0.01537322998046875, 0.0401611328125, -0.004749298095703125, -0.0340576171875, -0.049591064453125, -0.032135009765625, ...
open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30
2023-10-26T02:42:49.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T07:01:37
--- pretty_name: Evaluation run of JosephusCheung/Pwen-7B-Chat-20_30 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [JosephusCheung/Pwen-7B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T02:42:36.258115](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30/blob/main/results_2023-10-26T02-42-36.258115.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2954068791946309,\n\ \ \"em_stderr\": 0.004672175556184236,\n \"f1\": 0.3814209312080561,\n\ \ \"f1_stderr\": 0.004573085663083055,\n \"acc\": 0.44525521893903264,\n\ \ \"acc_stderr\": 0.012103729416391124\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2954068791946309,\n \"em_stderr\": 0.004672175556184236,\n\ \ \"f1\": 0.3814209312080561,\n \"f1_stderr\": 0.004573085663083055\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20621683093252463,\n \ \ \"acc_stderr\": 0.011144364089781436\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n\ \ }\n}\n```" repo_url: https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|arc:challenge|25_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T07-01-15.573690.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T02_42_36.258115 path: - '**/details_harness|drop|3_2023-10-26T02-42-36.258115.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T02-42-36.258115.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T02_42_36.258115 path: - '**/details_harness|gsm8k|5_2023-10-26T02-42-36.258115.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T02-42-36.258115.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hellaswag|10_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T07_01_15.573690 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T07_01_15.573690 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-01-15.573690.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-01-15.573690.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T02_42_36.258115 path: - '**/details_harness|winogrande|5_2023-10-26T02-42-36.258115.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T02-42-36.258115.parquet' - config_name: results data_files: - split: 2023_10_10T07_01_15.573690 path: - results_2023-10-10T07-01-15.573690.parquet - split: 2023_10_26T02_42_36.258115 path: - results_2023-10-26T02-42-36.258115.parquet - split: latest path: - results_2023-10-26T02-42-36.258115.parquet --- # Dataset Card for Evaluation run of JosephusCheung/Pwen-7B-Chat-20_30 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-7B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T02:42:36.258115](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30/blob/main/results_2023-10-26T02-42-36.258115.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2954068791946309, "em_stderr": 0.004672175556184236, "f1": 0.3814209312080561, "f1_stderr": 0.004573085663083055, "acc": 0.44525521893903264, "acc_stderr": 0.012103729416391124 }, "harness|drop|3": { "em": 0.2954068791946309, "em_stderr": 0.004672175556184236, "f1": 0.3814209312080561, "f1_stderr": 0.004573085663083055 }, "harness|gsm8k|5": { "acc": 0.20621683093252463, "acc_stderr": 0.011144364089781436 }, "harness|winogrande|5": { "acc": 0.6842936069455406, "acc_stderr": 0.01306309474300081 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
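The card above notes that each run appears as a split named after its timestamp (for example, run `2023-10-26T02:42:36.258115` lives in split `2023_10_26T02_42_36.258115`). Judging from the config listings, the split name seems to be derived by replacing the characters `-` and `:` with `_` while keeping `T` and `.`; a minimal sketch of that assumed convention, with a hypothetical helper name:

```python
# Hypothetical helper: map a run timestamp to the split name used in the
# configs above, assuming the convention visible in this card
# ("2023-10-26T02:42:36.258115" -> "2023_10_26T02_42_36.258115").
def timestamp_to_split(timestamp: str) -> str:
    # '-' and ':' are replaced with '_'; 'T' and '.' are kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-26T02:42:36.258115"))
# -> 2023_10_26T02_42_36.258115
```

This is only an observed pattern from the split names listed in this card, not a documented guarantee; the special split name `latest` always points at the most recent run regardless of timestamp.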
38,700
[ [ -0.02447509765625, -0.05316162109375, 0.01067352294921875, 0.0194854736328125, -0.00974273681640625, 0.00843048095703125, -0.035247802734375, -0.011871337890625, 0.033599853515625, 0.042694091796875, -0.046295166015625, -0.06719970703125, -0.050262451171875, ...
open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied
2023-10-26T06:44:37.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T07:26:05
--- pretty_name: Evaluation run of hiyouga/Baichuan2-7B-Base-LLaMAfied dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hiyouga/Baichuan2-7B-Base-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T06:44:24.493952](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied/blob/main/results_2023-10-26T06-44-24.493952.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\ \ \"em_stderr\": 0.0004058451132417743,\n \"f1\": 0.0585476090604028,\n\ \ \"f1_stderr\": 0.0013740361163735455,\n \"acc\": 0.3926358910777041,\n\ \ \"acc_stderr\": 0.010089987799825416\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417743,\n\ \ \"f1\": 0.0585476090604028,\n \"f1_stderr\": 0.0013740361163735455\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \ \ \"acc_stderr\": 0.007390654481108214\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7071823204419889,\n \"acc_stderr\": 0.01278932111854262\n\ \ }\n}\n```" repo_url: https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|arc:challenge|25_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T07-25-43.126145.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T06_44_24.493952 path: - '**/details_harness|drop|3_2023-10-26T06-44-24.493952.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T06-44-24.493952.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T06_44_24.493952 path: - '**/details_harness|gsm8k|5_2023-10-26T06-44-24.493952.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T06-44-24.493952.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hellaswag|10_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T07_25_43.126145 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-25-43.126145.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-25-43.126145.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T06_44_24.493952 path: - '**/details_harness|winogrande|5_2023-10-26T06-44-24.493952.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T06-44-24.493952.parquet' - config_name: results data_files: - split: 2023_10_10T07_25_43.126145 path: - results_2023-10-10T07-25-43.126145.parquet - split: 2023_10_26T06_44_24.493952 path: - results_2023-10-26T06-44-24.493952.parquet - split: latest path: - results_2023-10-26T06-44-24.493952.parquet --- # Dataset Card for Evaluation run of hiyouga/Baichuan2-7B-Base-LLaMAfied ## Dataset Description 
- **Homepage:** - **Repository:** https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [hiyouga/Baichuan2-7B-Base-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T06:44:24.493952](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied/blob/main/results_2023-10-26T06-44-24.493952.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each of them in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.001572986577181208, "em_stderr": 0.0004058451132417743, "f1": 0.0585476090604028, "f1_stderr": 0.0013740361163735455, "acc": 0.3926358910777041, "acc_stderr": 0.010089987799825416 }, "harness|drop|3": { "em": 0.001572986577181208, "em_stderr": 0.0004058451132417743, "f1": 0.0585476090604028, "f1_stderr": 0.0013740361163735455 }, "harness|gsm8k|5": { "acc": 0.07808946171341925, "acc_stderr": 0.007390654481108214 }, "harness|winogrande|5": { "acc": 0.7071823204419889, "acc_stderr": 0.01278932111854262 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
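The aggregated "all" block in the results JSON above appears to be the unweighted mean of the per-task scores. A minimal sketch of that sanity check, using the accuracy values copied from this card's latest run (illustrative only, not part of the dataset tooling):

```python
# Sanity-check the aggregated "all" accuracy against the per-task scores.
# Values are copied verbatim from the results JSON shown above.
results = {
    "all": {"acc": 0.3926358910777041},
    "harness|gsm8k|5": {"acc": 0.07808946171341925},
    "harness|winogrande|5": {"acc": 0.7071823204419889},
}

# Collect every per-task accuracy (everything except the "all" aggregate).
task_accs = [v["acc"] for name, v in results.items() if name != "all"]
mean_acc = sum(task_accs) / len(task_accs)

# The aggregate matches the unweighted mean of the per-task accuracies.
assert abs(mean_acc - results["all"]["acc"]) < 1e-9
print(f"mean acc over {len(task_accs)} tasks: {mean_acc:.4f}")
```

The same check can be run against any "results" config by loading its "latest" split and comparing the stored aggregate to the recomputed mean.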
38,738
[ [ -0.0263519287109375, -0.051025390625, 0.01027679443359375, 0.0257720947265625, -0.00917816162109375, 0.007137298583984375, -0.0330810546875, -0.01165008544921875, 0.02728271484375, 0.0440673828125, -0.046844482421875, -0.06658935546875, -0.04937744140625, 0....
open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied
2023-10-27T01:55:32.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T07:31:24
--- pretty_name: Evaluation run of hiyouga/Baichuan2-7B-Chat-LLaMAfied dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hiyouga/Baichuan2-7B-Chat-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T01:55:17.464897](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied/blob/main/results_2023-10-27T01-55-17.464897.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2062709731543624,\n\ \ \"em_stderr\": 0.004143762363131985,\n \"f1\": 0.26938129194630883,\n\ \ \"f1_stderr\": 0.004172682699820514,\n \"acc\": 0.40028530858265426,\n\ \ \"acc_stderr\": 0.010786124750718863\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2062709731543624,\n \"em_stderr\": 0.004143762363131985,\n\ \ \"f1\": 0.26938129194630883,\n \"f1_stderr\": 0.004172682699820514\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10917361637604246,\n \ \ \"acc_stderr\": 0.008590089300511151\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.691397000789266,\n \"acc_stderr\": 0.012982160200926577\n\ \ }\n}\n```" repo_url: https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|arc:challenge|25_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T07-31-02.024016.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T01_55_17.464897 path: - '**/details_harness|drop|3_2023-10-27T01-55-17.464897.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T01-55-17.464897.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T01_55_17.464897 path: - '**/details_harness|gsm8k|5_2023-10-27T01-55-17.464897.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T01-55-17.464897.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hellaswag|10_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-31-02.024016.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T07_31_02.024016 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-31-02.024016.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-31-02.024016.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T01_55_17.464897 path: - '**/details_harness|winogrande|5_2023-10-27T01-55-17.464897.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T01-55-17.464897.parquet' - config_name: results data_files: - split: 2023_10_10T07_31_02.024016 path: - results_2023-10-10T07-31-02.024016.parquet - split: 2023_10_27T01_55_17.464897 path: - results_2023-10-27T01-55-17.464897.parquet - split: latest path: - results_2023-10-27T01-55-17.464897.parquet --- # Dataset Card for Evaluation run of hiyouga/Baichuan2-7B-Chat-LLaMAfied ## Dataset Description 
- **Homepage:** - **Repository:** https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [hiyouga/Baichuan2-7B-Chat-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Chat-LLaMAfied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T01:55:17.464897](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Chat-LLaMAfied/blob/main/results_2023-10-27T01-55-17.464897.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2062709731543624, "em_stderr": 0.004143762363131985, "f1": 0.26938129194630883, "f1_stderr": 0.004172682699820514, "acc": 0.40028530858265426, "acc_stderr": 0.010786124750718863 }, "harness|drop|3": { "em": 0.2062709731543624, "em_stderr": 0.004143762363131985, "f1": 0.26938129194630883, "f1_stderr": 0.004172682699820514 }, "harness|gsm8k|5": { "acc": 0.10917361637604246, "acc_stderr": 0.008590089300511151 }, "harness|winogrande|5": { "acc": 0.691397000789266, "acc_stderr": 0.012982160200926577 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
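The "Latest results" JSON above keys each section as `harness|<task>|<n_shot>`; for quick side-by-side inspection, those nested metrics can be flattened into rows. A minimal sketch, operating only on an excerpt of the JSON shown above (not on the downloaded result files, whose surrounding structure may differ):

```python
import json

# Excerpt of the "Latest results" JSON from this card.
results = json.loads("""
{
  "all": {"em": 0.2062709731543624, "f1": 0.26938129194630883, "acc": 0.40028530858265426},
  "harness|drop|3": {"em": 0.2062709731543624, "em_stderr": 0.004143762363131985},
  "harness|gsm8k|5": {"acc": 0.10917361637604246, "acc_stderr": 0.008590089300511151},
  "harness|winogrande|5": {"acc": 0.691397000789266, "acc_stderr": 0.012982160200926577}
}
""")

# Flatten "harness|<task>|<n_shot>" sections into (task, n_shot, metric, value) rows,
# skipping the aggregated "all" block and the *_stderr companion entries.
rows = []
for section, metrics in results.items():
    if section == "all":
        continue
    _, task, shots = section.split("|")
    for metric, value in metrics.items():
        if metric.endswith("_stderr"):
            continue
        rows.append((task, int(shots), metric, round(value, 4)))

for row in sorted(rows):
    print(row)
# → ('drop', 3, 'em', 0.2063)
#   ('gsm8k', 5, 'acc', 0.1092)
#   ('winogrande', 5, 'acc', 0.6914)
```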
38,728
[ [ -0.02490234375, -0.056396484375, 0.00733184814453125, 0.0278778076171875, -0.0074310302734375, 0.0075225830078125, -0.035125732421875, -0.01312255859375, 0.02850341796875, 0.04156494140625, -0.04852294921875, -0.06500244140625, -0.0501708984375, 0.0062255859...
open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat
2023-10-10T07:41:08.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T07:40:08
--- pretty_name: Evaluation run of JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T07:39:47.100914](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat/blob/main/results_2023-10-10T07-39-47.100914.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44291345536273574,\n\ \ \"acc_stderr\": 0.03513140512866742,\n \"acc_norm\": 0.4461672418802564,\n\ \ \"acc_norm_stderr\": 0.03512502259107083,\n \"mc1\": 0.2864137086903305,\n\ \ \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.428667116433953,\n\ \ \"mc2_stderr\": 0.015095774970188642\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.45733788395904434,\n \"acc_stderr\": 0.014558106543924068,\n\ \ \"acc_norm\": 0.4735494880546075,\n \"acc_norm_stderr\": 0.014590931358120172\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5238996215893248,\n\ \ \"acc_stderr\": 0.004984077906216098,\n \"acc_norm\": 0.6996614220274846,\n\ \ \"acc_norm_stderr\": 0.004574683373821049\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\ \ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\ \ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\ \ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.03070948699255655,\n\ \ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.03070948699255655\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\ \ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\ \ \"acc_norm_stderr\": 0.039812405437178615\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\ \ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\ \ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\ \ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\ \ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\ \ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\ \ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\ \ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\ acc_norm\": 0.24867724867724866,\n 
\"acc_norm_stderr\": 0.022261817692400175\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\ \ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\ \ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5387096774193548,\n\ \ \"acc_stderr\": 0.028358634859836935,\n \"acc_norm\": 0.5387096774193548,\n\ \ \"acc_norm_stderr\": 0.028358634859836935\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\ \ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\ : 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.038592681420702615,\n\ \ \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.038592681420702615\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.5656565656565656,\n \"acc_stderr\": 0.035315058793591834,\n \"\ acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.035315058793591834\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048574,\n\ \ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048574\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4205128205128205,\n \"acc_stderr\": 0.025028610276710855,\n\ \ \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.025028610276710855\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \ \ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.032284106267163895,\n\ \ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.032284106267163895\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987053,\n \"\ acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987053\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.5724770642201835,\n \"acc_stderr\": 0.021210910204300437,\n \"\ acc_norm\": 0.5724770642201835,\n \"acc_norm_stderr\": 0.021210910204300437\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n \"\ acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.5490196078431373,\n \"acc_stderr\": 0.034924061041636124,\n \"\ acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.034924061041636124\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370672,\n \ \ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370672\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\ \ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n\ \ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\ \ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\ acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\ \ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\ \ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\ \ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\ \ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\ \ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n\ \ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\ \ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.7521367521367521,\n\ \ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6143039591315453,\n\ \ \"acc_stderr\": 0.017406476619212904,\n \"acc_norm\": 0.6143039591315453,\n\ \ \"acc_norm_stderr\": 0.017406476619212904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.026882643434022885,\n\ \ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.026882643434022885\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\ \ \"acc_stderr\": 0.014874252168095268,\n 
\"acc_norm\": 0.27150837988826815,\n\ \ \"acc_norm_stderr\": 0.014874252168095268\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.028358956313423545,\n\ \ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.028358956313423545\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\ \ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.4758842443729904,\n\ \ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327242,\n\ \ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327242\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509317,\n \ \ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509317\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n\ \ \"acc_stderr\": 0.01218255231321517,\n \"acc_norm\": 0.3500651890482399,\n\ \ \"acc_norm_stderr\": 0.01218255231321517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.028332959514031236,\n\ \ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.028332959514031236\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.42483660130718953,\n \"acc_stderr\": 0.019997973035458336,\n \ \ \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.019997973035458336\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\ \ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\ \ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\ \ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 
0.031814251181977865\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\ \ \"acc_stderr\": 0.03487558640462064,\n \"acc_norm\": 0.582089552238806,\n\ \ \"acc_norm_stderr\": 0.03487558640462064\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\ \ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\ \ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.037627386999170565,\n\ \ \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.037627386999170565\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\ \ \"mc1_stderr\": 0.015826142439502346,\n \"mc2\": 0.428667116433953,\n\ \ \"mc2_stderr\": 0.015095774970188642\n }\n}\n```" repo_url: https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|arc:challenge|25_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hellaswag|10_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-39-47.100914.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T07_39_47.100914 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-39-47.100914.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T07-39-47.100914.parquet' - config_name: results data_files: - split: 2023_10_10T07_39_47.100914 path: - results_2023-10-10T07-39-47.100914.parquet - split: latest path: - results_2023-10-10T07-39-47.100914.parquet --- # Dataset Card for Evaluation run of JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-VL-LLaMAfied-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T07:39:47.100914](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-VL-LLaMAfied-7B-Chat/blob/main/results_2023-10-10T07-39-47.100914.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.44291345536273574, "acc_stderr": 0.03513140512866742, "acc_norm": 0.4461672418802564, "acc_norm_stderr": 0.03512502259107083, "mc1": 0.2864137086903305, "mc1_stderr": 0.015826142439502346, "mc2": 0.428667116433953, "mc2_stderr": 0.015095774970188642 }, "harness|arc:challenge|25": { "acc": 0.45733788395904434, "acc_stderr": 0.014558106543924068, "acc_norm": 0.4735494880546075, "acc_norm_stderr": 0.014590931358120172 }, "harness|hellaswag|10": { "acc": 0.5238996215893248, "acc_stderr": 0.004984077906216098, "acc_norm": 0.6996614220274846, "acc_norm_stderr": 0.004574683373821049 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4074074074074074, "acc_stderr": 0.04244633238353228, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.42105263157894735, "acc_stderr": 0.04017901275981749, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.04017901275981749 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4679245283018868, "acc_stderr": 0.03070948699255655, "acc_norm": 0.4679245283018868, "acc_norm_stderr": 0.03070948699255655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3472222222222222, "acc_stderr": 0.039812405437178615, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, 
"acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.045126085985421255, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421255 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3872832369942196, "acc_stderr": 0.037143259063020656, "acc_norm": 0.3872832369942196, "acc_norm_stderr": 0.037143259063020656 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3829787234042553, "acc_stderr": 0.03177821250236922, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.03177821250236922 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4482758620689655, "acc_stderr": 0.04144311810878151, "acc_norm": 0.4482758620689655, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24867724867724866, "acc_stderr": 0.022261817692400175, "acc_norm": 0.24867724867724866, "acc_norm_stderr": 0.022261817692400175 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.040735243221471255, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.040735243221471255 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5387096774193548, "acc_stderr": 0.028358634859836935, "acc_norm": 0.5387096774193548, "acc_norm_stderr": 0.028358634859836935 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.33497536945812806, "acc_stderr": 0.033208527423483104, "acc_norm": 0.33497536945812806, "acc_norm_stderr": 0.033208527423483104 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.42424242424242425, "acc_stderr": 0.038592681420702615, "acc_norm": 0.42424242424242425, "acc_norm_stderr": 0.038592681420702615 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5656565656565656, "acc_stderr": 0.035315058793591834, "acc_norm": 0.5656565656565656, "acc_norm_stderr": 0.035315058793591834 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6269430051813472, "acc_stderr": 0.03490205592048574, "acc_norm": 0.6269430051813472, "acc_norm_stderr": 0.03490205592048574 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4205128205128205, "acc_stderr": 0.025028610276710855, "acc_norm": 0.4205128205128205, "acc_norm_stderr": 0.025028610276710855 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23703703703703705, "acc_stderr": 0.025928876132766118, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.025928876132766118 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.44537815126050423, "acc_stderr": 0.032284106267163895, "acc_norm": 0.44537815126050423, "acc_norm_stderr": 0.032284106267163895 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.23178807947019867, "acc_stderr": 0.03445406271987053, "acc_norm": 0.23178807947019867, "acc_norm_stderr": 0.03445406271987053 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5724770642201835, "acc_stderr": 0.021210910204300437, "acc_norm": 0.5724770642201835, "acc_norm_stderr": 0.021210910204300437 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3055555555555556, "acc_stderr": 
0.031415546294025445, "acc_norm": 0.3055555555555556, "acc_norm_stderr": 0.031415546294025445 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5490196078431373, "acc_stderr": 0.034924061041636124, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.034924061041636124 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5949367088607594, "acc_stderr": 0.03195514741370672, "acc_norm": 0.5949367088607594, "acc_norm_stderr": 0.03195514741370672 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.57847533632287, "acc_stderr": 0.033141902221106564, "acc_norm": 0.57847533632287, "acc_norm_stderr": 0.033141902221106564 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.4351145038167939, "acc_stderr": 0.04348208051644858, "acc_norm": 0.4351145038167939, "acc_norm_stderr": 0.04348208051644858 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6611570247933884, "acc_stderr": 0.043207678075366705, "acc_norm": 0.6611570247933884, "acc_norm_stderr": 0.043207678075366705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5370370370370371, "acc_stderr": 0.04820403072760627, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.04820403072760627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4785276073619632, "acc_stderr": 0.0392474687675113, "acc_norm": 0.4785276073619632, "acc_norm_stderr": 0.0392474687675113 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.5048543689320388, "acc_stderr": 0.049505043821289195, "acc_norm": 0.5048543689320388, "acc_norm_stderr": 0.049505043821289195 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7521367521367521, "acc_stderr": 0.028286324075564393, "acc_norm": 0.7521367521367521, "acc_norm_stderr": 0.028286324075564393 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.47, "acc_stderr": 
0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6143039591315453, "acc_stderr": 0.017406476619212904, "acc_norm": 0.6143039591315453, "acc_norm_stderr": 0.017406476619212904 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5260115606936416, "acc_stderr": 0.026882643434022885, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.026882643434022885 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27150837988826815, "acc_stderr": 0.014874252168095268, "acc_norm": 0.27150837988826815, "acc_norm_stderr": 0.014874252168095268 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.43137254901960786, "acc_stderr": 0.028358956313423545, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.028358956313423545 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.4758842443729904, "acc_stderr": 0.028365041542564577, "acc_norm": 0.4758842443729904, "acc_norm_stderr": 0.028365041542564577 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.4876543209876543, "acc_stderr": 0.027812262269327242, "acc_norm": 0.4876543209876543, "acc_norm_stderr": 0.027812262269327242 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.31560283687943264, "acc_stderr": 0.027724989449509317, "acc_norm": 0.31560283687943264, "acc_norm_stderr": 0.027724989449509317 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3500651890482399, "acc_stderr": 0.01218255231321517, "acc_norm": 0.3500651890482399, "acc_norm_stderr": 0.01218255231321517 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.31985294117647056, "acc_stderr": 0.028332959514031236, "acc_norm": 0.31985294117647056, "acc_norm_stderr": 0.028332959514031236 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.42483660130718953, "acc_stderr": 0.019997973035458336, "acc_norm": 0.42483660130718953, "acc_norm_stderr": 0.019997973035458336 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, 
"acc_stderr": 0.04782001791380061, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5551020408163265, "acc_stderr": 0.031814251181977865, "acc_norm": 0.5551020408163265, "acc_norm_stderr": 0.031814251181977865 }, "harness|hendrycksTest-sociology|5": { "acc": 0.582089552238806, "acc_stderr": 0.03487558640462064, "acc_norm": 0.582089552238806, "acc_norm_stderr": 0.03487558640462064 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.03836722176598052, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.03836722176598052 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5964912280701754, "acc_stderr": 0.037627386999170565, "acc_norm": 0.5964912280701754, "acc_norm_stderr": 0.037627386999170565 }, "harness|truthfulqa:mc|0": { "mc1": 0.2864137086903305, "mc1_stderr": 0.015826142439502346, "mc2": 0.428667116433953, "mc2_stderr": 0.015095774970188642 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
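The aggregated JSON in the "Latest results" section above can be sliced with the standard library alone; a minimal sketch, using a small fragment copied verbatim from that results block (no fields beyond those shown are assumed):

```python
import json

# A fragment of the aggregated results shown above, copied verbatim from the card.
results = json.loads("""
{
  "all": {"acc": 0.44291345536273574, "mc1": 0.2864137086903305},
  "harness|hendrycksTest-marketing|5": {"acc": 0.7521367521367521},
  "harness|hendrycksTest-college_physics|5": {"acc": 0.23529411764705882}
}
""")

# Pick out per-task accuracies, skipping the "all" aggregate entry.
per_task = {k: v["acc"] for k, v in results.items() if k != "all"}
best = max(per_task, key=per_task.get)
print(best)  # the strongest task in this fragment
```

The same dictionary shape holds for every `harness|...` key in the full results JSON, so the comprehension extends unchanged to the complete block.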
65,187
[ [ -0.048583984375, -0.059906005859375, 0.017974853515625, 0.01509857177734375, -0.01235198974609375, -0.0028018951416015625, -0.0006303787231445312, -0.0157318115234375, 0.04095458984375, -0.0011835098266601562, -0.03387451171875, -0.04901123046875, -0.03048706054...
ravivishwakarmauzio/finetuning_llama2
2023-10-10T09:09:07.000Z
[ "region:us" ]
ravivishwakarmauzio
null
null
0
0
2023-10-10T07:52:35
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 338808 num_examples: 200 download_size: 0 dataset_size: 338808 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "finetuning_llama2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
436
[ [ -0.033355712890625, -0.017669677734375, 0.01371002197265625, 0.025482177734375, -0.033843994140625, -0.0067291259765625, 0.0191497802734375, -0.0174407958984375, 0.050811767578125, 0.035491943359375, -0.054656982421875, -0.049774169921875, -0.0458984375, -0....
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial
2023-10-25T02:58:34.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T08:03:52
--- pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Charlie911/vicuna-7b-v1.5-lora-timedial](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T02:58:22.436019](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial/blob/main/results_2023-10-25T02-58-22.436019.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004299496644295302,\n\ \ \"em_stderr\": 0.0006700586558629934,\n \"f1\": 0.06840499161073847,\n\ \ \"f1_stderr\": 0.001566173833045158,\n \"acc\": 0.40418915336712596,\n\ \ \"acc_stderr\": 0.009775164829075637\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.004299496644295302,\n \"em_stderr\": 0.0006700586558629934,\n\ \ \"f1\": 0.06840499161073847,\n \"f1_stderr\": 0.001566173833045158\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07278241091736164,\n \ \ \"acc_stderr\": 0.007155604761167476\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983797\n\ \ }\n}\n```" repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|arc:challenge|25_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T08-03-27.841263.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T02_58_22.436019 path: - '**/details_harness|drop|3_2023-10-25T02-58-22.436019.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T02-58-22.436019.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T02_58_22.436019 path: - '**/details_harness|gsm8k|5_2023-10-25T02-58-22.436019.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T02-58-22.436019.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hellaswag|10_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-03-27.841263.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T08_03_27.841263 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T08-03-27.841263.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T08-03-27.841263.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T02_58_22.436019 path: - '**/details_harness|winogrande|5_2023-10-25T02-58-22.436019.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T02-58-22.436019.parquet' - config_name: results data_files: - split: 2023_10_10T08_03_27.841263 path: - results_2023-10-10T08-03-27.841263.parquet - split: 2023_10_25T02_58_22.436019 path: - results_2023-10-25T02-58-22.436019.parquet - split: latest path: - results_2023-10-25T02-58-22.436019.parquet --- # Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-timedial](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T02:58:22.436019](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial/blob/main/results_2023-10-25T02-58-22.436019.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.004299496644295302, "em_stderr": 0.0006700586558629934, "f1": 0.06840499161073847, "f1_stderr": 0.001566173833045158, "acc": 0.40418915336712596, "acc_stderr": 0.009775164829075637 }, "harness|drop|3": { "em": 0.004299496644295302, "em_stderr": 0.0006700586558629934, "f1": 0.06840499161073847, "f1_stderr": 0.001566173833045158 }, "harness|gsm8k|5": { "acc": 0.07278241091736164, "acc_stderr": 0.007155604761167476 }, "harness|winogrande|5": { "acc": 0.7355958958168903, "acc_stderr": 0.012394724896983797 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
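The config listings above encode each run's timestamp twice: once as a split name and once in the parquet filename. From the entries shown (e.g. split `2023_10_25T02_58_22.436019` vs. file `results_2023-10-25T02-58-22.436019.parquet`), the convention appears to be that split names replace both `-` and `:` with `_`, while filenames keep `-` and replace only `:`. A minimal sketch of that mapping (the helper names are illustrative, not part of the `datasets` API):

```python
def run_split_name(timestamp: str) -> str:
    # Split names use underscores everywhere:
    # "2023-10-25T02:58:22.436019" -> "2023_10_25T02_58_22.436019"
    return timestamp.replace("-", "_").replace(":", "_")

def run_file_stamp(timestamp: str) -> str:
    # Filenames keep dashes and only replace colons:
    # "2023-10-25T02:58:22.436019" -> "2023-10-25T02-58-22.436019"
    return timestamp.replace(":", "-")

ts = "2023-10-25T02:58:22.436019"
print(run_split_name(ts))  # 2023_10_25T02_58_22.436019
print(run_file_stamp(ts))  # 2023-10-25T02-58-22.436019
```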
38,790
[ [ -0.026947021484375, -0.05523681640625, 0.0179901123046875, 0.021148681640625, -0.0175628662109375, 0.003711700439453125, -0.021270751953125, -0.018768310546875, 0.03704833984375, 0.044342041015625, -0.048919677734375, -0.06964111328125, -0.04388427734375, 0....
open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30
2023-10-10T08:18:43.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T08:17:43
--- pretty_name: Evaluation run of JosephusCheung/Pwen-VL-Chat-20_30 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [JosephusCheung/Pwen-VL-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T08:17:20.929764](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30/blob/main/results_2023-10-10T08-17-20.929764.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5614045523007456,\n\ \ \"acc_stderr\": 0.034472805150990236,\n \"acc_norm\": 0.5650409022375938,\n\ \ \"acc_norm_stderr\": 0.03446466967324352,\n \"mc1\": 0.2913096695226438,\n\ \ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\ \ \"mc2_stderr\": 0.01461529390566251\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.4709897610921502,\n \"acc_stderr\": 0.014586776355294316,\n\ \ \"acc_norm\": 0.5017064846416383,\n \"acc_norm_stderr\": 0.01461130570505699\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5382393945429197,\n\ \ \"acc_stderr\": 0.004975167382061832,\n \"acc_norm\": 0.7220673172674766,\n\ \ \"acc_norm_stderr\": 0.004470644845242893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.043192236258113324,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.043192236258113324\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981748,\n\ \ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981748\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\ \ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \ \ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\ \ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\ \ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\ \ \"acc_norm_stderr\": 0.04076663253918567\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\"\ : {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n\ \ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n\ \ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n\ \ \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n\ \ \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\ \ 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"\ acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\ acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\ 
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\ \ \"acc_stderr\": 0.026593084516572284,\n \"acc_norm\": 0.6774193548387096,\n\ \ \"acc_norm_stderr\": 0.026593084516572284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\ \ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\ : 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\ \ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\ acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\ \ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\ \ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \ \ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708443,\n \"\ acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708443\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\ acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"\ acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \ \ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\ \ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 
0.03941897526516302,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\ \ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\ \ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\ \ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\ \ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\ \ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\ \ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\ \ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\ \ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\ \ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\ \ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\ \ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\ \ \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 0.3027932960893855,\n\ \ \"acc_norm_stderr\": 0.01536686038639711\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\ \ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\ \ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\ \ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\ \ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \ \ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\ \ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\ \ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \ \ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\ \ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\ \ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087548,\n\ \ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087548\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\ \ 
\"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\ \ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \ \ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\ \ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\ \ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\ \ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\ \ \"mc2_stderr\": 0.01461529390566251\n }\n}\n```" repo_url: https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|arc:challenge|25_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hellaswag|10_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-17-20.929764.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-17-20.929764.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-17-20.929764.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-17-20.929764.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T08_17_20.929764 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T08-17-20.929764.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T08-17-20.929764.parquet' - config_name: results data_files: - split: 2023_10_10T08_17_20.929764 path: - results_2023-10-10T08-17-20.929764.parquet - split: latest path: - results_2023-10-10T08-17-20.929764.parquet --- # Dataset Card for Evaluation run of JosephusCheung/Pwen-VL-Chat-20_30 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-VL-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-VL-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T08:17:20.929764](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-VL-Chat-20_30/blob/main/results_2023-10-10T08-17-20.929764.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5614045523007456, "acc_stderr": 0.034472805150990236, "acc_norm": 0.5650409022375938, "acc_norm_stderr": 0.03446466967324352, "mc1": 0.2913096695226438, "mc1_stderr": 0.015905987048184828, "mc2": 0.42517178573631115, "mc2_stderr": 0.01461529390566251 }, "harness|arc:challenge|25": { "acc": 0.4709897610921502, "acc_stderr": 0.014586776355294316, "acc_norm": 0.5017064846416383, "acc_norm_stderr": 0.01461130570505699 }, "harness|hellaswag|10": { "acc": 0.5382393945429197, "acc_stderr": 0.004975167382061832, "acc_norm": 0.7220673172674766, "acc_norm_stderr": 0.004470644845242893 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.043192236258113324, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.043192236258113324 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5789473684210527, "acc_stderr": 0.04017901275981748, "acc_norm": 0.5789473684210527, "acc_norm_stderr": 0.04017901275981748 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.04793724854411019, "acc_norm": 0.65, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6415094339622641, "acc_stderr": 0.02951470358398177, "acc_norm": 0.6415094339622641, "acc_norm_stderr": 0.02951470358398177 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207762, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207762 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.502127659574468, "acc_stderr": 0.032685726586674915, "acc_norm": 0.502127659574468, "acc_norm_stderr": 0.032685726586674915 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.0416180850350153, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3412698412698413, "acc_stderr": 0.024419234966819067, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.024419234966819067 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04285714285714281, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04285714285714281 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6774193548387096, "acc_stderr": 0.026593084516572284, "acc_norm": 0.6774193548387096, "acc_norm_stderr": 0.026593084516572284 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4187192118226601, "acc_stderr": 0.03471192860518468, "acc_norm": 0.4187192118226601, "acc_norm_stderr": 0.03471192860518468 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2606060606060606, "acc_stderr": 0.03427743175816524, "acc_norm": 0.2606060606060606, "acc_norm_stderr": 0.03427743175816524 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7373737373737373, "acc_stderr": 0.03135305009533084, "acc_norm": 0.7373737373737373, "acc_norm_stderr": 0.03135305009533084 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.02897908979429673, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5128205128205128, "acc_stderr": 0.025342671293807257, "acc_norm": 0.5128205128205128, "acc_norm_stderr": 0.025342671293807257 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.02696242432507382, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.02696242432507382 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5462184873949579, "acc_stderr": 0.03233943468182088, "acc_norm": 0.5462184873949579, "acc_norm_stderr": 0.03233943468182088 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7541284403669725, "acc_stderr": 0.018461940968708443, "acc_norm": 0.7541284403669725, "acc_norm_stderr": 0.018461940968708443 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.41203703703703703, "acc_stderr": 0.03356787758160835, "acc_norm": 0.41203703703703703, "acc_norm_stderr": 
0.03356787758160835 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6029411764705882, "acc_stderr": 0.03434131164719129, "acc_norm": 0.6029411764705882, "acc_norm_stderr": 0.03434131164719129 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7637130801687764, "acc_stderr": 0.02765215314415927, "acc_norm": 0.7637130801687764, "acc_norm_stderr": 0.02765215314415927 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.648854961832061, "acc_stderr": 0.04186445163013751, "acc_norm": 0.648854961832061, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516302, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516302 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6944444444444444, "acc_stderr": 0.04453197507374984, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.04453197507374984 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6441717791411042, "acc_stderr": 0.03761521380046734, "acc_norm": 0.6441717791411042, "acc_norm_stderr": 0.03761521380046734 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.047184714852195886, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.047184714852195886 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.045416094465039476, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.045416094465039476 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8290598290598291, "acc_stderr": 0.02466249684520982, "acc_norm": 0.8290598290598291, "acc_norm_stderr": 0.02466249684520982 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.7624521072796935, "acc_stderr": 0.015218733046150193, "acc_norm": 0.7624521072796935, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6069364161849711, "acc_stderr": 0.02629622791561367, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.02629622791561367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3027932960893855, "acc_stderr": 0.01536686038639711, "acc_norm": 0.3027932960893855, "acc_norm_stderr": 0.01536686038639711 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6339869281045751, "acc_stderr": 0.02758281141515961, "acc_norm": 0.6339869281045751, "acc_norm_stderr": 0.02758281141515961 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.02698147804364804, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.02698147804364804 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6080246913580247, "acc_stderr": 0.027163686038271146, "acc_norm": 0.6080246913580247, "acc_norm_stderr": 0.027163686038271146 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41134751773049644, "acc_stderr": 0.02935491115994098, "acc_norm": 0.41134751773049644, "acc_norm_stderr": 0.02935491115994098 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4517601043024772, "acc_stderr": 0.012710662233660247, "acc_norm": 0.4517601043024772, "acc_norm_stderr": 0.012710662233660247 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5718954248366013, "acc_stderr": 0.0200176292142131, "acc_norm": 0.5718954248366013, "acc_norm_stderr": 0.0200176292142131 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5818181818181818, "acc_stderr": 0.04724577405731572, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 
0.04724577405731572 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6326530612244898, "acc_stderr": 0.030862144921087548, "acc_norm": 0.6326530612244898, "acc_norm_stderr": 0.030862144921087548 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7810945273631841, "acc_stderr": 0.029239174636647, "acc_norm": 0.7810945273631841, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-virology|5": { "acc": 0.46987951807228917, "acc_stderr": 0.03885425420866766, "acc_norm": 0.46987951807228917, "acc_norm_stderr": 0.03885425420866766 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.2913096695226438, "mc1_stderr": 0.015905987048184828, "mc2": 0.42517178573631115, "mc2_stderr": 0.01461529390566251 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
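In the configurations above, each run's split name (e.g. `2023_10_10T08_17_20.929764`) and the timestamp embedded in its parquet filenames (e.g. `2023-10-10T08-17-20.929764`) encode the same run time with different separators. A minimal, purely illustrative sketch of mapping the split name to the filename form (the helper name is an assumption, not part of the leaderboard tooling):

```python
def split_to_path_timestamp(split_name: str) -> str:
    """Map a run split name such as '2023_10_10T08_17_20.929764' to the
    hyphenated timestamp used in the parquet filenames."""
    # Only the date/time separators differ: underscores become hyphens
    # on both sides of the 'T'; the fractional seconds stay unchanged.
    date_part, time_part = split_name.split("T")
    return date_part.replace("_", "-") + "T" + time_part.replace("_", "-")

print(split_to_path_timestamp("2023_10_10T08_17_20.929764"))
# -> 2023-10-10T08-17-20.929764
```

This is handy when matching a split listed under `data_files` to its on-disk `details_*` parquet file.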
64,938
[ [ -0.047760009765625, -0.05987548828125, 0.0179901123046875, 0.0158233642578125, -0.01084136962890625, -0.004398345947265625, -0.0014657974243164062, -0.0137481689453125, 0.04071044921875, -0.0018749237060546875, -0.03350830078125, -0.048858642578125, -0.031555175...
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13
2023-10-24T22:04:58.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T08:32:31
--- pretty_name: Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [OpenBuddy/openbuddy-mistral-7b-v13](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T22:04:44.332803](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13/blob/main/results_2023-10-24T22-04-44.332803.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.29750419463087246,\n\ \ \"em_stderr\": 0.004681748345750226,\n \"f1\": 0.3555442533557056,\n\ \ \"f1_stderr\": 0.004616201496073195,\n \"acc\": 0.4322619501392136,\n\ \ \"acc_stderr\": 0.011205063255665634\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.29750419463087246,\n \"em_stderr\": 0.004681748345750226,\n\ \ \"f1\": 0.3555442533557056,\n \"f1_stderr\": 0.004616201496073195\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1470811220621683,\n \ \ \"acc_stderr\": 0.00975606366035987\n },\n \"harness|winogrande|5\":\ \ {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.012654062850971398\n\ \ }\n}\n```" repo_url: https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|arc:challenge|25_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T08-32-08.394718.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T22_04_44.332803 path: - '**/details_harness|drop|3_2023-10-24T22-04-44.332803.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T22-04-44.332803.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T22_04_44.332803 path: - '**/details_harness|gsm8k|5_2023-10-24T22-04-44.332803.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T22-04-44.332803.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hellaswag|10_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T08_32_08.394718 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T08-32-08.394718.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T08_32_08.394718 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T08-32-08.394718.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T08-32-08.394718.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T22_04_44.332803 path: - '**/details_harness|winogrande|5_2023-10-24T22-04-44.332803.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T22-04-44.332803.parquet' - config_name: results data_files: - split: 2023_10_10T08_32_08.394718 path: - results_2023-10-10T08-32-08.394718.parquet - split: 2023_10_24T22_04_44.332803 path: - results_2023-10-24T22-04-44.332803.parquet - split: latest path: - results_2023-10-24T22-04-44.332803.parquet --- # Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v13](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can, for instance, do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T22:04:44.332803](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13/blob/main/results_2023-10-24T22-04-44.332803.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.29750419463087246, "em_stderr": 0.004681748345750226, "f1": 0.3555442533557056, "f1_stderr": 0.004616201496073195, "acc": 0.4322619501392136, "acc_stderr": 0.011205063255665634 }, "harness|drop|3": { "em": 0.29750419463087246, "em_stderr": 0.004681748345750226, "f1": 0.3555442533557056, "f1_stderr": 0.004616201496073195 }, "harness|gsm8k|5": { "acc": 0.1470811220621683, "acc_stderr": 0.00975606366035987 }, "harness|winogrande|5": { "acc": 0.7174427782162589, "acc_stderr": 0.012654062850971398 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,713
[ [ -0.029022216796875, -0.052978515625, 0.0087127685546875, 0.01654052734375, -0.00827789306640625, -0.0017185211181640625, -0.026580810546875, -0.0112457275390625, 0.0194549560546875, 0.0369873046875, -0.0386962890625, -0.06768798828125, -0.043853759765625, 0....
danieletdg/eCommerceQuery
2023-10-10T08:38:25.000Z
[ "region:us" ]
danieletdg
null
null
1
0
2023-10-10T08:38:19
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: text dtype: string - name: entities dtype: string splits: - name: train num_bytes: 398956 num_examples: 3994 - name: test num_bytes: 1597728 num_examples: 15980 download_size: 1007526 dataset_size: 1996684 --- # Dataset Card for "eCommerceQuery" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
578
[ [ -0.039276123046875, -0.0259552001953125, 0.00925445556640625, 0.00272369384765625, -0.00826263427734375, 0.0007047653198242188, 0.0188446044921875, -0.03076171875, 0.054443359375, 0.041107177734375, -0.07562255859375, -0.05950927734375, -0.0154571533203125, ...
c0m/123
2023-10-10T09:19:40.000Z
[ "region:us" ]
c0m
null
null
0
0
2023-10-10T09:19:40
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
temasarkisov/EsportLogosV2_processed_V2
2023-10-10T09:28:56.000Z
[ "region:us" ]
temasarkisov
null
null
0
0
2023-10-10T09:28:53
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 4561815.0 num_examples: 73 download_size: 4560462 dataset_size: 4561815.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "EsportLogosV2_processed_V2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
489
[ [ -0.0190277099609375, -0.0156707763671875, 0.01074981689453125, 0.0208740234375, -0.02581787109375, 0.0034027099609375, 0.0177459716796875, -0.0257415771484375, 0.06146240234375, 0.0465087890625, -0.0687255859375, -0.04412841796875, -0.046295166015625, -0.018...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1
2023-10-25T21:38:13.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T09:30:57
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T21:38:01.231208](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1/blob/main/results_2023-10-25T21-38-01.231208.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22818791946308725,\n\ \ \"em_stderr\": 0.00429775606227976,\n \"f1\": 0.2705872483221472,\n\ \ \"f1_stderr\": 0.004287875673448546,\n \"acc\": 0.45044049897886096,\n\ \ \"acc_stderr\": 0.010454670771991827\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.22818791946308725,\n \"em_stderr\": 0.00429775606227976,\n\ \ \"f1\": 0.2705872483221472,\n \"f1_stderr\": 0.004287875673448546\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12661106899166036,\n \ \ \"acc_stderr\": 0.009159715283081099\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902557\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|arc:challenge|25_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T09-30-33.515075.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T21_38_01.231208 path: - '**/details_harness|drop|3_2023-10-25T21-38-01.231208.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T21-38-01.231208.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T21_38_01.231208 path: - '**/details_harness|gsm8k|5_2023-10-25T21-38-01.231208.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T21-38-01.231208.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hellaswag|10_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-30-33.515075.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T09_30_33.515075 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-30-33.515075.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-30-33.515075.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T21_38_01.231208 path: - '**/details_harness|winogrande|5_2023-10-25T21-38-01.231208.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T21-38-01.231208.parquet' - config_name: results data_files: - split: 2023_10_10T09_30_33.515075 path: - results_2023-10-10T09-30-33.515075.parquet - split: 2023_10_25T21_38_01.231208 path: - results_2023-10-25T21-38-01.231208.parquet - split: latest path: - results_2023-10-25T21-38-01.231208.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1 
## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-25T21:38:01.231208](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down-test1/blob/main/results_2023-10-25T21-38-01.231208.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.22818791946308725,
        "em_stderr": 0.00429775606227976,
        "f1": 0.2705872483221472,
        "f1_stderr": 0.004287875673448546,
        "acc": 0.45044049897886096,
        "acc_stderr": 0.010454670771991827
    },
    "harness|drop|3": {
        "em": 0.22818791946308725,
        "em_stderr": 0.00429775606227976,
        "f1": 0.2705872483221472,
        "f1_stderr": 0.004287875673448546
    },
    "harness|gsm8k|5": {
        "acc": 0.12661106899166036,
        "acc_stderr": 0.009159715283081099
    },
    "harness|winogrande|5": {
        "acc": 0.7742699289660616,
        "acc_stderr": 0.011749626260902557
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
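As a sanity check on the numbers in the "Latest results" block above, the aggregated "acc" in the "all" entry appears to be the simple arithmetic mean of the two per-task accuracies (gsm8k and winogrande). A minimal sketch verifying this from the values quoted there:

```python
# Per-task metrics quoted from the latest-results block above.
latest = {
    "harness|drop|3": {"em": 0.22818791946308725, "f1": 0.2705872483221472},
    "harness|gsm8k|5": {"acc": 0.12661106899166036},
    "harness|winogrande|5": {"acc": 0.7742699289660616},
}

# Collect the accuracy-style metrics and average them.
accs = {task: m["acc"] for task, m in latest.items() if "acc" in m}
mean_acc = sum(accs.values()) / len(accs)

# The "all" entry reports acc = 0.45044049897886096 -- the mean of the two.
print(round(mean_acc, 6))
```

This mirrors how the leaderboard aggregates per-task scores into the headline metric for this run.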
39,014
open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0
2023-10-28T16:31:20.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T09:36:03
--- pretty_name: Evaluation run of uukuguy/speechless-code-mistral-7b-v1.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [uukuguy/speechless-code-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T16:31:08.459023](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0/blob/main/results_2023-10-28T16-31-08.459023.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1579278523489933,\n\ \ \"em_stderr\": 0.003734596341987714,\n \"f1\": 0.21190331375838886,\n\ \ \"f1_stderr\": 0.0037546108265308093,\n \"acc\": 0.48935508173001835,\n\ \ \"acc_stderr\": 0.011177063823008385\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.1579278523489933,\n \"em_stderr\": 0.003734596341987714,\n\ \ \"f1\": 0.21190331375838886,\n \"f1_stderr\": 0.0037546108265308093\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19181197877179681,\n \ \ \"acc_stderr\": 0.01084516995529402\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.01150895769072275\n\ \ }\n}\n```" repo_url: https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|arc:challenge|25_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T09-35-40.611521.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T16_31_08.459023 path: - '**/details_harness|drop|3_2023-10-28T16-31-08.459023.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T16-31-08.459023.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T16_31_08.459023 path: - '**/details_harness|gsm8k|5_2023-10-28T16-31-08.459023.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T16-31-08.459023.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hellaswag|10_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-40.611521.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T09_35_40.611521 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-40.611521.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-40.611521.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T16_31_08.459023 path: - '**/details_harness|winogrande|5_2023-10-28T16-31-08.459023.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T16-31-08.459023.parquet' - config_name: results data_files: - split: 2023_10_10T09_35_40.611521 path: - results_2023-10-10T09-35-40.611521.parquet - split: 2023_10_28T16_31_08.459023 path: - results_2023-10-28T16-31-08.459023.parquet - split: latest path: - results_2023-10-28T16-31-08.459023.parquet --- # Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v1.0 ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/speechless-code-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T16:31:08.459023](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v1.0/blob/main/results_2023-10-28T16-31-08.459023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.1579278523489933, "em_stderr": 0.003734596341987714, "f1": 0.21190331375838886, "f1_stderr": 0.0037546108265308093, "acc": 0.48935508173001835, "acc_stderr": 0.011177063823008385 }, "harness|drop|3": { "em": 0.1579278523489933, "em_stderr": 0.003734596341987714, "f1": 0.21190331375838886, "f1_stderr": 0.0037546108265308093 }, "harness|gsm8k|5": { "acc": 0.19181197877179681, "acc_stderr": 0.01084516995529402 }, "harness|winogrande|5": { "acc": 0.7868981846882399, "acc_stderr": 0.01150895769072275 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
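As a minimal sketch of how the per-task metrics in the results JSON above can be read back out, the snippet below parses two of the entries with the standard `json` module (the values are copied from the "Latest results" section; no download is involved):

```python
import json

# Two per-task entries from the aggregated results file shown above.
results_json = """
{
  "harness|winogrande|5": {
    "acc": 0.7868981846882399,
    "acc_stderr": 0.01150895769072275
  },
  "harness|gsm8k|5": {
    "acc": 0.19181197877179681,
    "acc_stderr": 0.01084516995529402
  }
}
"""

# Each top-level key is a "harness|<task>|<num_fewshot>" identifier.
metrics = json.loads(results_json)
winogrande_acc = metrics["harness|winogrande|5"]["acc"]
gsm8k_acc = metrics["harness|gsm8k|5"]["acc"]
print(f"winogrande 5-shot acc: {winogrande_acc:.4f}")
print(f"gsm8k 5-shot acc: {gsm8k_acc:.4f}")
```

The same key pattern applies to the full `results_*.json` files stored in the repository.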
38,778
[ [ -0.025634765625, -0.042938232421875, 0.01323699951171875, 0.0225830078125, -0.0142059326171875, 0.002277374267578125, -0.02349853515625, -0.0123443603515625, 0.0229949951171875, 0.0445556640625, -0.0450439453125, -0.06634521484375, -0.04449462890625, 0.01064...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down
2023-10-25T22:16:28.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T09:36:18
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T22:16:15.844961](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-25T22-16-15.844961.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.044148489932885907,\n\ \ \"em_stderr\": 0.0021037435299994067,\n \"f1\": 0.09857802013422778,\n\ \ \"f1_stderr\": 0.0023775705231284467,\n \"acc\": 0.4617194030779578,\n\ \ \"acc_stderr\": 0.010887835734442838\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.044148489932885907,\n \"em_stderr\": 0.0021037435299994067,\n\ \ \"f1\": 0.09857802013422778,\n \"f1_stderr\": 0.0023775705231284467\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15390447308567096,\n \ \ \"acc_stderr\": 0.009939799304049013\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836666\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|arc:challenge|25_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T09-35-55.043179.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T22_16_15.844961 path: - '**/details_harness|drop|3_2023-10-25T22-16-15.844961.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T22-16-15.844961.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T22_16_15.844961 path: - '**/details_harness|gsm8k|5_2023-10-25T22-16-15.844961.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T22-16-15.844961.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hellaswag|10_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_5 data_files: 
- split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-35-55.043179.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T09_35_55.043179 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-55.043179.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-35-55.043179.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T22_16_15.844961 path: - '**/details_harness|winogrande|5_2023-10-25T22-16-15.844961.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T22-16-15.844961.parquet' - config_name: results data_files: - split: 2023_10_10T09_35_55.043179 path: - results_2023-10-10T09-35-55.043179.parquet - split: 2023_10_25T22_16_15.844961 path: - results_2023-10-25T22-16-15.844961.parquet - split: latest path: - results_2023-10-25T22-16-15.844961.parquet --- # Dataset Card for Evaluation run of 
CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T22:16:15.844961](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_addto15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-25T22-16-15.844961.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.044148489932885907, "em_stderr": 0.0021037435299994067, "f1": 0.09857802013422778, "f1_stderr": 0.0023775705231284467, "acc": 0.4617194030779578, "acc_stderr": 0.010887835734442838 }, "harness|drop|3": { "em": 0.044148489932885907, "em_stderr": 0.0021037435299994067, "f1": 0.09857802013422778, "f1_stderr": 0.0023775705231284467 }, "harness|gsm8k|5": { "acc": 0.15390447308567096, "acc_stderr": 0.009939799304049013 }, "harness|winogrande|5": { "acc": 0.7695343330702447, "acc_stderr": 0.011835872164836666 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
39,068
[ [ -0.0294342041015625, -0.053863525390625, 0.0186309814453125, 0.01971435546875, -0.01488494873046875, 0.01195526123046875, -0.0260009765625, -0.020538330078125, 0.03326416015625, 0.038299560546875, -0.052276611328125, -0.06610107421875, -0.053924560546875, 0....
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down
2023-10-28T17:14:26.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T09:43:07
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T17:14:13.466730](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-28T17-14-13.466730.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009123322147651007,\n\ \ \"em_stderr\": 0.000973701770554167,\n \"f1\": 0.06916421979865739,\n\ \ \"f1_stderr\": 0.00161270274465004,\n \"acc\": 0.43659154378391707,\n\ \ \"acc_stderr\": 0.01026195907539337\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.009123322147651007,\n \"em_stderr\": 0.000973701770554167,\n\ \ \"f1\": 0.06916421979865739,\n \"f1_stderr\": 0.00161270274465004\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10917361637604246,\n \ \ \"acc_stderr\": 0.008590089300511116\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|arc:challenge|25_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T09-42-44.126959.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T17_14_13.466730 path: - '**/details_harness|drop|3_2023-10-28T17-14-13.466730.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T17-14-13.466730.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T17_14_13.466730 path: - '**/details_harness|gsm8k|5_2023-10-28T17-14-13.466730.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T17-14-13.466730.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hellaswag|10_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_5 data_files: - 
split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-42-44.126959.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T09_42_44.126959 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-42-44.126959.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-42-44.126959.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T17_14_13.466730 path: - '**/details_harness|winogrande|5_2023-10-28T17-14-13.466730.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T17-14-13.466730.parquet' - config_name: results data_files: - split: 2023_10_10T09_42_44.126959 path: - results_2023-10-10T09-42-44.126959.parquet - split: 2023_10_28T17_14_13.466730 path: - results_2023-10-28T17-14-13.466730.parquet - split: latest path: - results_2023-10-28T17-14-13.466730.parquet --- # Dataset Card for Evaluation run of 
CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T17:14:13.466730](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_compare15k_4.5w-r16-gate_up_down/blob/main/results_2023-10-28T17-14-13.466730.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each of them in the "results" and the "latest" split for each eval): ```python { "all": { "em": 0.009123322147651007, "em_stderr": 0.000973701770554167, "f1": 0.06916421979865739, "f1_stderr": 0.00161270274465004, "acc": 0.43659154378391707, "acc_stderr": 0.01026195907539337 }, "harness|drop|3": { "em": 0.009123322147651007, "em_stderr": 0.000973701770554167, "f1": 0.06916421979865739, "f1_stderr": 0.00161270274465004 }, "harness|gsm8k|5": { "acc": 0.10917361637604246, "acc_stderr": 0.008590089300511116 }, "harness|winogrande|5": { "acc": 0.7640094711917916, "acc_stderr": 0.011933828850275625 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
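The nested results payload above is plain JSON, so it can be tabulated with a few lines of Python. This is only an illustrative sketch: the `flatten_metrics` helper and variable names are my own, and the dict literal copies a subset of the values shown above.

```python
# Per-task metrics copied from the "Latest results" JSON above
# (abridged; the "all" block in the card is the aggregate of these).
results = {
    "harness|drop|3": {"em": 0.009123322147651007, "f1": 0.06916421979865739},
    "harness|gsm8k|5": {"acc": 0.10917361637604246},
    "harness|winogrande|5": {"acc": 0.7640094711917916},
}

def flatten_metrics(payload):
    """Flatten {task: {metric: value}} into {"task/metric": value} pairs."""
    return {
        f"{task}/{metric}": value
        for task, metrics in payload.items()
        for metric, value in metrics.items()
    }

flat = flatten_metrics(results)
for name, value in sorted(flat.items()):
    print(f"{name}: {value:.4f}")
```

The same flattening works on the full JSON file linked in "Latest results", since every run file shares this two-level task/metric layout.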
39,080
[ [ -0.0290985107421875, -0.053924560546875, 0.017333984375, 0.018829345703125, -0.014068603515625, 0.01183319091796875, -0.0256195068359375, -0.020721435546875, 0.03271484375, 0.03802490234375, -0.05108642578125, -0.06689453125, -0.05328369140625, 0.01770019531...
Jagadeesh-ti/hr_v2
2023-10-10T09:48:49.000Z
[ "region:us" ]
Jagadeesh-ti
null
null
0
0
2023-10-10T09:48:20
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o
2023-10-28T03:33:04.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T09:49:16
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T03:32:51.454817](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o/blob/main/results_2023-10-28T03-32-51.454817.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37258808724832215,\n\ \ \"em_stderr\": 0.004951428522573584,\n \"f1\": 0.41863255033557134,\n\ \ \"f1_stderr\": 0.004838761301543826,\n \"acc\": 0.4445987937813739,\n\ \ \"acc_stderr\": 0.010466651540029098\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.37258808724832215,\n \"em_stderr\": 0.004951428522573584,\n\ \ \"f1\": 0.41863255033557134,\n \"f1_stderr\": 0.004838761301543826\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \ \ \"acc_stderr\": 0.009041108602874675\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|arc:challenge|25_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T09-48-52.263585.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T03_32_51.454817 path: - '**/details_harness|drop|3_2023-10-28T03-32-51.454817.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T03-32-51.454817.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T03_32_51.454817 path: - '**/details_harness|gsm8k|5_2023-10-28T03-32-51.454817.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T03-32-51.454817.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hellaswag|10_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-48-52.263585.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T09_48_52.263585 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-48-52.263585.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-48-52.263585.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T03_32_51.454817 path: - '**/details_harness|winogrande|5_2023-10-28T03-32-51.454817.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T03-32-51.454817.parquet' - config_name: results data_files: - split: 2023_10_10T09_48_52.263585 path: - results_2023-10-10T09-48-52.263585.parquet - split: 2023_10_28T03_32_51.454817 path: - results_2023-10-28T03-32-51.454817.parquet - split: latest path: - results_2023-10-28T03-32-51.454817.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T03:32:51.454817](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o/blob/main/results_2023-10-28T03-32-51.454817.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.37258808724832215, "em_stderr": 0.004951428522573584, "f1": 0.41863255033557134, "f1_stderr": 0.004838761301543826, "acc": 0.4445987937813739, "acc_stderr": 0.010466651540029098 }, "harness|drop|3": { "em": 0.37258808724832215, "em_stderr": 0.004951428522573584, "f1": 0.41863255033557134, "f1_stderr": 0.004838761301543826 }, "harness|gsm8k|5": { "acc": 0.12282031842304776, "acc_stderr": 0.009041108602874675 }, "harness|winogrande|5": { "acc": 0.7663772691397001, "acc_stderr": 0.011892194477183524 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,852
[ [ -0.02783203125, -0.053314208984375, 0.0167999267578125, 0.0200958251953125, -0.016693115234375, 0.01256561279296875, -0.0268707275390625, -0.0211334228515625, 0.033477783203125, 0.03875732421875, -0.050933837890625, -0.0662841796875, -0.05230712890625, 0.018...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o
2023-10-10T09:57:03.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T09:56:03
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T09:55:39.074089](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o/blob/main/results_2023-10-10T09-55-39.074089.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5579208482981199,\n\ \ \"acc_stderr\": 0.03451179922986214,\n \"acc_norm\": 0.561910682652304,\n\ \ \"acc_norm_stderr\": 0.034493014919848415,\n \"mc1\": 0.27906976744186046,\n\ \ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4152969798879882,\n\ \ \"mc2_stderr\": 0.014212723478778425\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.014558106543924068,\n\ \ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650652\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6117307309300936,\n\ \ \"acc_stderr\": 0.004863603638367452,\n \"acc_norm\": 0.8172674765982872,\n\ \ \"acc_norm_stderr\": 0.00385657294683102\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\ \ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\ \ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\ \ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.02983280811479601,\n\ \ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.02983280811479601\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\ \ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n\ \ \"acc_norm_stderr\": 0.04101405519842425\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\ \ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\ \ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\ \ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\ \ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\ \ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\ \ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\ acc_norm\": 0.3492063492063492,\n 
\"acc_norm_stderr\": 0.024552292209342658\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\ \ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\ \ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\ \ \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n\ \ \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\ \ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\ : 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\ \ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\ acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011747,\n\ \ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011747\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n\ \ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \ \ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"\ acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502327,\n \"\ acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502327\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n \"\ acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \ \ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\ \ \"acc_stderr\": 0.03219079200419994,\n \"acc_norm\": 0.6412556053811659,\n\ \ \"acc_norm_stderr\": 0.03219079200419994\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\ \ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\ \ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\ \ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\ \ \"acc_stderr\": 0.026453508054040314,\n \"acc_norm\": 0.7948717948717948,\n\ \ \"acc_norm_stderr\": 0.026453508054040314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \ \ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\ \ \"acc_stderr\": 0.015491088951494578,\n \"acc_norm\": 0.7496807151979565,\n\ \ \"acc_norm_stderr\": 0.015491088951494578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n\ \ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\ \ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 
0.33854748603351953,\n\ \ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\ \ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\ \ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\ \ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\ \ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\ \ \"acc_stderr\": 0.012604960816087378,\n \"acc_norm\": 0.4198174706649283,\n\ \ \"acc_norm_stderr\": 0.012604960816087378\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271486,\n\ \ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271486\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \ \ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n\ \ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\ \ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\ \ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720685\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\ \ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\ \ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\ \ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\ \ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4152969798879882,\n\ \ \"mc2_stderr\": 0.014212723478778425\n }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|arc:challenge|25_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hellaswag|10_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T09-55-39.074089.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T09_55_39.074089 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-55-39.074089.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T09-55-39.074089.parquet' - config_name: results data_files: - split: 2023_10_10T09_55_39.074089 path: - results_2023-10-10T09-55-39.074089.parquet - split: latest path: - results_2023-10-10T09-55-39.074089.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T09:55:39.074089](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o/blob/main/results_2023-10-10T09-55-39.074089.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5579208482981199, "acc_stderr": 0.03451179922986214, "acc_norm": 0.561910682652304, "acc_norm_stderr": 0.034493014919848415, "mc1": 0.27906976744186046, "mc1_stderr": 0.015702107090627908, "mc2": 0.4152969798879882, "mc2_stderr": 0.014212723478778425 }, "harness|arc:challenge|25": { "acc": 0.5426621160409556, "acc_stderr": 0.014558106543924068, "acc_norm": 0.5725255972696246, "acc_norm_stderr": 0.014456862944650652 }, "harness|hellaswag|10": { "acc": 0.6117307309300936, "acc_stderr": 0.004863603638367452, "acc_norm": 0.8172674765982872, "acc_norm_stderr": 0.00385657294683102 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5259259259259259, "acc_stderr": 0.04313531696750575, "acc_norm": 0.5259259259259259, "acc_norm_stderr": 0.04313531696750575 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5328947368421053, "acc_stderr": 0.040601270352363966, "acc_norm": 0.5328947368421053, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.02983280811479601, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.02983280811479601 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842425, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842425 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5317919075144508, "acc_stderr": 0.03804749744364764, "acc_norm": 0.5317919075144508, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.451063829787234, "acc_stderr": 0.032529096196131965, "acc_norm": 0.451063829787234, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.043391383225798615, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.043391383225798615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3492063492063492, "acc_stderr": 0.024552292209342658, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.024552292209342658 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.040735243221471255, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.040735243221471255 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.026729499068349958, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.026729499068349958 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.458128078817734, "acc_stderr": 0.03505630140785741, "acc_norm": 0.458128078817734, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.03588624800091706, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03588624800091706 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.03242497958178816, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.03242497958178816 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8134715025906736, "acc_stderr": 0.02811209121011747, "acc_norm": 0.8134715025906736, "acc_norm_stderr": 0.02811209121011747 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5512820512820513, "acc_stderr": 0.025217315184846482, "acc_norm": 0.5512820512820513, "acc_norm_stderr": 0.025217315184846482 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 0.03181110032413925, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413925 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7522935779816514, "acc_stderr": 0.018508143602547832, "acc_norm": 0.7522935779816514, "acc_norm_stderr": 0.018508143602547832 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03388857118502327, 
"acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502327 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.029771775228145638, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.029771775228145638 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.03219079200419994, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.03219079200419994 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.04284467968052194, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.04284467968052194 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6687116564417178, "acc_stderr": 0.03697983910025588, "acc_norm": 0.6687116564417178, "acc_norm_stderr": 0.03697983910025588 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.0432704093257873, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.0432704093257873 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7948717948717948, "acc_stderr": 0.026453508054040314, "acc_norm": 0.7948717948717948, "acc_norm_stderr": 0.026453508054040314 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, 
"acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7496807151979565, "acc_stderr": 0.015491088951494578, "acc_norm": 0.7496807151979565, "acc_norm_stderr": 0.015491088951494578 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6040462427745664, "acc_stderr": 0.026329813341946243, "acc_norm": 0.6040462427745664, "acc_norm_stderr": 0.026329813341946243 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33854748603351953, "acc_stderr": 0.01582670009648135, "acc_norm": 0.33854748603351953, "acc_norm_stderr": 0.01582670009648135 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5980392156862745, "acc_stderr": 0.02807415894760065, "acc_norm": 0.5980392156862745, "acc_norm_stderr": 0.02807415894760065 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.662379421221865, "acc_stderr": 0.026858825879488544, "acc_norm": 0.662379421221865, "acc_norm_stderr": 0.026858825879488544 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6141975308641975, "acc_stderr": 0.027085401226132143, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132143 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4198174706649283, "acc_stderr": 0.012604960816087378, "acc_norm": 0.4198174706649283, "acc_norm_stderr": 0.012604960816087378 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5735294117647058, "acc_stderr": 0.03004261583271486, "acc_norm": 0.5735294117647058, "acc_norm_stderr": 0.03004261583271486 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5669934640522876, "acc_stderr": 0.020045442473324227, "acc_norm": 0.5669934640522876, "acc_norm_stderr": 0.020045442473324227 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, 
"acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.563265306122449, "acc_stderr": 0.031751952375833226, "acc_norm": 0.563265306122449, "acc_norm_stderr": 0.031751952375833226 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458618, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458618 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.71, "acc_stderr": 0.04560480215720685, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720685 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479636, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479636 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.031267817146631786, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.031267817146631786 }, "harness|truthfulqa:mc|0": { "mc1": 0.27906976744186046, "mc1_stderr": 0.015702107090627908, "mc2": 0.4152969798879882, "mc2_stderr": 0.014212723478778425 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
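The per-task entries in the "Latest results" JSON above are plain nested dicts once parsed, so they can be post-processed without any extra tooling. Below is a minimal sketch that ranks subjects by accuracy; the dict literal copies four entries from the JSON above, and `rank_tasks` is an illustrative helper of our own, not part of the `datasets` API:

```python
# Rank a handful of per-task accuracies copied from the results JSON above.
# `rank_tasks` is an illustrative helper, not part of any library API.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7894736842105263},
    "harness|hendrycksTest-management|5": {"acc": 0.7669902912621359},
    "harness|hendrycksTest-virology|5": {"acc": 0.39156626506024095},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.29365079365079366},
}

def rank_tasks(res):
    """Return (task, metrics) pairs sorted from highest to lowest accuracy."""
    return sorted(res.items(), key=lambda kv: kv[1]["acc"], reverse=True)

for task, metrics in rank_tasks(results):
    print(f"{task}: {metrics['acc']:.3f}")
```

The same pattern applies unchanged to the full dict obtained by parsing the linked `results_*.json` file.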
65,184
[ [ -0.048675537109375, -0.06134033203125, 0.0199737548828125, 0.01554107666015625, -0.01348876953125, -0.0024127960205078125, -0.000019788742065429688, -0.0180511474609375, 0.04095458984375, -0.0030612945556640625, -0.034576416015625, -0.0472412109375, -0.032073974...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down
2023-10-29T07:06:39.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T10:01:42
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-29T07:06:26.845938](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down/blob/main/results_2023-10-29T07-06-26.845938.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20836828859060402,\n\ \ \"em_stderr\": 0.004159269440162747,\n \"f1\": 0.2507906879194633,\n\ \ \"f1_stderr\": 0.004162090421371717,\n \"acc\": 0.43807672814244847,\n\ \ \"acc_stderr\": 0.01035305451841861\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.20836828859060402,\n \"em_stderr\": 0.004159269440162747,\n\ \ \"f1\": 0.2507906879194633,\n \"f1_stderr\": 0.004162090421371717\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \ \ \"acc_stderr\": 0.008744810131034056\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803166\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|arc:challenge|25_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T10-01-17.783068.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_29T07_06_26.845938 path: - '**/details_harness|drop|3_2023-10-29T07-06-26.845938.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-29T07-06-26.845938.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_29T07_06_26.845938 path: - '**/details_harness|gsm8k|5_2023-10-29T07-06-26.845938.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-29T07-06-26.845938.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hellaswag|10_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T10_01_17.783068 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-01-17.783068.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-01-17.783068.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_29T07_06_26.845938 path: - '**/details_harness|winogrande|5_2023-10-29T07-06-26.845938.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-29T07-06-26.845938.parquet' - config_name: results data_files: - split: 2023_10_10T10_01_17.783068 path: - results_2023-10-10T10-01-17.783068.parquet - split: 2023_10_29T07_06_26.845938 path: - results_2023-10-29T07-06-26.845938.parquet - split: latest path: - results_2023-10-29T07-06-26.845938.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down ## 
Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-29T07:06:26.845938](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down/blob/main/results_2023-10-29T07-06-26.845938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.20836828859060402, "em_stderr": 0.004159269440162747, "f1": 0.2507906879194633, "f1_stderr": 0.004162090421371717, "acc": 0.43807672814244847, "acc_stderr": 0.01035305451841861 }, "harness|drop|3": { "em": 0.20836828859060402, "em_stderr": 0.004159269440162747, "f1": 0.2507906879194633, "f1_stderr": 0.004162090421371717 }, "harness|gsm8k|5": { "acc": 0.11372251705837756, "acc_stderr": 0.008744810131034056 }, "harness|winogrande|5": { "acc": 0.7624309392265194, "acc_stderr": 0.011961298905803166 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
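The repository name used in the `load_dataset` call above follows a simple, mechanical pattern: the model id with its `/` replaced by `__`, prefixed with `open-llm-leaderboard/details_`. A small helper can derive it; this is an illustrative sketch based only on the naming visible in this card, not an official API:

```python
def details_repo(model_id: str) -> str:
    """Derive the details-dataset repo name for a model id like "org/model".

    Illustrative only: it follows the naming pattern visible in this card,
    where the "/" in the model id becomes "__".
    """
    org, name = model_id.split("/", 1)
    return f"open-llm-leaderboard/details_{org}__{name}"

print(details_repo("CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down"))
# open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down
```

The returned string matches the repository passed to `load_dataset` in the example above.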
38,920
[ [ -0.0276947021484375, -0.053985595703125, 0.0184478759765625, 0.0186920166015625, -0.01360321044921875, 0.01236724853515625, -0.0259246826171875, -0.020294189453125, 0.03253173828125, 0.03851318359375, -0.052490234375, -0.06707763671875, -0.053802490234375, 0...
open-llm-leaderboard/details_elinas__chronos007-70b
2023-10-10T10:10:12.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T10:09:14
--- pretty_name: Evaluation run of elinas/chronos007-70b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [elinas/chronos007-70b](https://huggingface.co/elinas/chronos007-70b) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos007-70b\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T10:08:50.772021](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos007-70b/blob/main/results_2023-10-10T10-08-50.772021.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6924704612932385,\n\ \ \"acc_stderr\": 0.031262676706071496,\n \"acc_norm\": 0.6964780207046983,\n\ \ \"acc_norm_stderr\": 0.03123103152479671,\n \"mc1\": 0.41370869033047736,\n\ \ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5765003665263857,\n\ \ \"mc2_stderr\": 0.0150600091771299\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620453,\n\ \ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.01337407861506874\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6874128659629556,\n\ \ \"acc_stderr\": 0.004626002828389176,\n \"acc_norm\": 0.8752240589524,\n\ \ \"acc_norm_stderr\": 0.003297893047728379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\ \ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\ \ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \ \ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\ \ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.031164899666948617\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\ \ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\ \ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\ \ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\ \ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\ \ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\ \ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400492,\n \"\ acc_norm\": 0.4365079365079365,\n 
\"acc_norm_stderr\": 0.025542846817400492\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n\ \ \"acc_stderr\": 0.021090847745939306,\n \"acc_norm\": 0.8354838709677419,\n\ \ \"acc_norm_stderr\": 0.021090847745939306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n\ \ \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\ : 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\ \ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\ acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\ \ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.023290888053772725,\n\ \ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.023290888053772725\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.02788682807838055,\n \ \ \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.02788682807838055\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\ acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8825688073394495,\n \"acc_stderr\": 0.01380278022737736,\n \"\ acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.01380278022737736\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\ acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\ acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \ \ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\ \ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\ \ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515368,\n\ \ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515368\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\ acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\ \ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\ \ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\ \ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\ \ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\ \ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\ \ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8480204342273308,\n\ \ \"acc_stderr\": 0.012837852506645216,\n \"acc_norm\": 0.8480204342273308,\n\ \ \"acc_norm_stderr\": 0.012837852506645216\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n\ \ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5508379888268157,\n\ \ \"acc_stderr\": 0.01663583834163193,\n \"acc_norm\": 0.5508379888268157,\n\ \ \"acc_norm_stderr\": 
0.01663583834163193\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n\ \ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n\ \ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n\ \ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396154,\n\ \ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396154\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236834,\n \ \ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236834\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5417209908735332,\n\ \ \"acc_stderr\": 0.012725701656953642,\n \"acc_norm\": 0.5417209908735332,\n\ \ \"acc_norm_stderr\": 0.012725701656953642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\ \ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n\ \ \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n\ \ \"acc_stderr\": 0.026176967197866764,\n \"acc_norm\": 0.7877551020408163,\n\ \ \"acc_norm_stderr\": 0.026176967197866764\n },\n \"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.8855721393034826,\n 
\"acc_stderr\": 0.022509345325101706,\n\ \ \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n\ \ \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n\ \ \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n\ \ \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n\ \ \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n\ \ \"mc2\": 0.5765003665263857,\n \"mc2_stderr\": 0.0150600091771299\n\ \ }\n}\n```" repo_url: https://huggingface.co/elinas/chronos007-70b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|arc:challenge|25_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hellaswag|10_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-08-50.772021.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-08-50.772021.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-08-50.772021.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-08-50.772021.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T10_08_50.772021 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-08-50.772021.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-08-50.772021.parquet' - config_name: results data_files: - split: 2023_10_10T10_08_50.772021 path: - results_2023-10-10T10-08-50.772021.parquet - split: latest path: - results_2023-10-10T10-08-50.772021.parquet --- # Dataset Card for Evaluation run of elinas/chronos007-70b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/elinas/chronos007-70b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [elinas/chronos007-70b](https://huggingface.co/elinas/chronos007-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_elinas__chronos007-70b", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T10:08:50.772021](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos007-70b/blob/main/results_2023-10-10T10-08-50.772021.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6924704612932385, "acc_stderr": 0.031262676706071496, "acc_norm": 0.6964780207046983, "acc_norm_stderr": 0.03123103152479671, "mc1": 0.41370869033047736, "mc1_stderr": 0.0172408618120998, "mc2": 0.5765003665263857, "mc2_stderr": 0.0150600091771299 }, "harness|arc:challenge|25": { "acc": 0.6527303754266212, "acc_stderr": 0.013913034529620453, "acc_norm": 0.7013651877133106, "acc_norm_stderr": 0.01337407861506874 }, "harness|hellaswag|10": { "acc": 0.6874128659629556, "acc_stderr": 0.004626002828389176, "acc_norm": 0.8752240589524, "acc_norm_stderr": 0.003297893047728379 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7828947368421053, "acc_stderr": 0.03355045304882924, "acc_norm": 0.7828947368421053, "acc_norm_stderr": 0.03355045304882924 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8333333333333334, "acc_stderr": 0.031164899666948617, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.031164899666948617 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 
0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416907, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.047240073523838876, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.047240073523838876 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6212765957446809, "acc_stderr": 0.03170995606040655, "acc_norm": 0.6212765957446809, "acc_norm_stderr": 0.03170995606040655 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.040703290137070705, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.040703290137070705 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4365079365079365, "acc_stderr": 0.025542846817400492, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.025542846817400492 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8354838709677419, "acc_stderr": 0.021090847745939306, "acc_norm": 0.8354838709677419, "acc_norm_stderr": 0.021090847745939306 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5467980295566502, "acc_stderr": 0.03502544650845872, "acc_norm": 0.5467980295566502, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8363636363636363, "acc_stderr": 0.02888787239548795, "acc_norm": 0.8363636363636363, "acc_norm_stderr": 0.02888787239548795 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8636363636363636, "acc_stderr": 0.024450155973189835, "acc_norm": 0.8636363636363636, "acc_norm_stderr": 0.024450155973189835 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6974358974358974, "acc_stderr": 0.023290888053772725, "acc_norm": 0.6974358974358974, "acc_norm_stderr": 0.023290888053772725 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652458, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652458 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7563025210084033, "acc_stderr": 0.02788682807838055, "acc_norm": 0.7563025210084033, "acc_norm_stderr": 0.02788682807838055 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5033112582781457, "acc_stderr": 0.04082393379449654, "acc_norm": 0.5033112582781457, "acc_norm_stderr": 0.04082393379449654 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8825688073394495, "acc_stderr": 0.01380278022737736, "acc_norm": 0.8825688073394495, "acc_norm_stderr": 0.01380278022737736 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 0.03392238405321617, 
"acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9117647058823529, "acc_stderr": 0.019907399791316945, "acc_norm": 0.9117647058823529, "acc_norm_stderr": 0.019907399791316945 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.869198312236287, "acc_stderr": 0.02194876605947076, "acc_norm": 0.869198312236287, "acc_norm_stderr": 0.02194876605947076 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7757847533632287, "acc_stderr": 0.027991534258519513, "acc_norm": 0.7757847533632287, "acc_norm_stderr": 0.027991534258519513 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8549618320610687, "acc_stderr": 0.030884661089515368, "acc_norm": 0.8549618320610687, "acc_norm_stderr": 0.030884661089515368 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8842975206611571, "acc_stderr": 0.029199802455622814, "acc_norm": 0.8842975206611571, "acc_norm_stderr": 0.029199802455622814 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.03157065078911901, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.03157065078911901 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9145299145299145, "acc_stderr": 0.01831589168562585, "acc_norm": 0.9145299145299145, "acc_norm_stderr": 0.01831589168562585 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, 
"acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8480204342273308, "acc_stderr": 0.012837852506645216, "acc_norm": 0.8480204342273308, "acc_norm_stderr": 0.012837852506645216 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7803468208092486, "acc_stderr": 0.022289638852617893, "acc_norm": 0.7803468208092486, "acc_norm_stderr": 0.022289638852617893 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5508379888268157, "acc_stderr": 0.01663583834163193, "acc_norm": 0.5508379888268157, "acc_norm_stderr": 0.01663583834163193 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958154, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958154 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7491961414790996, "acc_stderr": 0.024619771956697168, "acc_norm": 0.7491961414790996, "acc_norm_stderr": 0.024619771956697168 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.808641975308642, "acc_stderr": 0.021887704613396154, "acc_norm": 0.808641975308642, "acc_norm_stderr": 0.021887704613396154 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5425531914893617, "acc_stderr": 0.029719281272236834, "acc_norm": 0.5425531914893617, "acc_norm_stderr": 0.029719281272236834 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5417209908735332, "acc_stderr": 0.012725701656953642, "acc_norm": 0.5417209908735332, "acc_norm_stderr": 0.012725701656953642 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7169117647058824, "acc_stderr": 0.02736586113151381, "acc_norm": 0.7169117647058824, "acc_norm_stderr": 0.02736586113151381 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.75, "acc_stderr": 0.01751781884501444, "acc_norm": 0.75, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7545454545454545, "acc_stderr": 0.041220665028782855, "acc_norm": 0.7545454545454545, 
"acc_norm_stderr": 0.041220665028782855 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7877551020408163, "acc_stderr": 0.026176967197866764, "acc_norm": 0.7877551020408163, "acc_norm_stderr": 0.026176967197866764 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101706, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.0266405825391332, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.0266405825391332 }, "harness|truthfulqa:mc|0": { "mc1": 0.41370869033047736, "mc1_stderr": 0.0172408618120998, "mc2": 0.5765003665263857, "mc2_stderr": 0.0150600091771299 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
64,811
[ [ -0.049560546875, -0.059478759765625, 0.0180816650390625, 0.0124053955078125, -0.0121612548828125, -0.00482940673828125, 0.0008072853088378906, -0.0167999267578125, 0.041259765625, -0.004123687744140625, -0.033355712890625, -0.047760009765625, -0.02960205078125, ...
giuseppemartino/i-SAID_custom
2023-10-10T15:43:47.000Z
[ "region:us" ]
giuseppemartino
null
null
0
0
2023-10-10T10:11:38
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: image dtype: image - name: label dtype: image splits: - name: train num_bytes: 6362576122.0 num_examples: 840 - name: validation num_bytes: 905977299.0 num_examples: 99 download_size: 7262651438 dataset_size: 7268553421.0 --- # Dataset Card for "i-SAID_custom" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
605
[ [ -0.061187744140625, -0.0282135009765625, 0.01438140869140625, 0.01105499267578125, -0.01065826416015625, 0.0074920654296875, 0.00399017333984375, -0.015716552734375, 0.08551025390625, 0.0367431640625, -0.0679931640625, -0.059234619140625, -0.0345458984375, -...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
2023-10-25T21:40:28.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T10:21:06
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T21:40:15.944875](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-25T21-40-15.944875.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07697147651006711,\n\ \ \"em_stderr\": 0.002729682408788614,\n \"f1\": 0.12191170302013389,\n\ \ \"f1_stderr\": 0.0028589398116221384,\n \"acc\": 0.44546584943629414,\n\ \ \"acc_stderr\": 0.01035635936441261\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.07697147651006711,\n \"em_stderr\": 0.002729682408788614,\n\ \ \"f1\": 0.12191170302013389,\n \"f1_stderr\": 0.0028589398116221384\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \ \ \"acc_stderr\": 0.00891970291116163\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|arc:challenge|25_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T10-20-42.158103.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T21_40_15.944875 path: - '**/details_harness|drop|3_2023-10-25T21-40-15.944875.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T21-40-15.944875.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T21_40_15.944875 path: - '**/details_harness|gsm8k|5_2023-10-25T21-40-15.944875.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T21-40-15.944875.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hellaswag|10_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T10_20_42.158103 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-20-42.158103.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-20-42.158103.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T21_40_15.944875 path: - '**/details_harness|winogrande|5_2023-10-25T21-40-15.944875.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T21-40-15.944875.parquet' - config_name: results data_files: - split: 2023_10_10T10_20_42.158103 path: - results_2023-10-10T10-20-42.158103.parquet - split: 2023_10_25T21_40_15.944875 path: - results_2023-10-25T21-40-15.944875.parquet - split: latest path: - results_2023-10-25T21-40-15.944875.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down 
## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T21:40:15.944875](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-25T21-40-15.944875.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.07697147651006711, "em_stderr": 0.002729682408788614, "f1": 0.12191170302013389, "f1_stderr": 0.0028589398116221384, "acc": 0.44546584943629414, "acc_stderr": 0.01035635936441261 }, "harness|drop|3": { "em": 0.07697147651006711, "em_stderr": 0.002729682408788614, "f1": 0.12191170302013389, "f1_stderr": 0.0028589398116221384 }, "harness|gsm8k|5": { "acc": 0.11902956785443518, "acc_stderr": 0.00891970291116163 }, "harness|winogrande|5": { "acc": 0.7719021310181531, "acc_stderr": 0.011793015817663592 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
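The `acc`/`acc_stderr` pairs in the "Latest results" JSON above can be post-processed directly. As a small self-contained sketch (metric values copied verbatim from the 2023-10-25 run shown in this card; the aggregation rule is inferred from the numbers, not from leaderboard code), the top-level `"all"` accuracy is consistent with the unweighted mean of the per-task accuracies, and each `acc_stderr` gives a rough 95% confidence interval:

```python
# Per-task metrics copied verbatim from the "Latest results" JSON above.
latest = {
    "harness|gsm8k|5": {"acc": 0.11902956785443518, "acc_stderr": 0.00891970291116163},
    "harness|winogrande|5": {"acc": 0.7719021310181531, "acc_stderr": 0.011793015817663592},
}

# Unweighted mean of the per-task accuracies; this reproduces the
# "all".acc value reported above (~0.4455).
mean_acc = sum(m["acc"] for m in latest.values()) / len(latest)
print(f"mean acc: {mean_acc:.4f}")

# Rough 95% confidence interval per task from the reported standard error.
for task, m in latest.items():
    lo, hi = m["acc"] - 1.96 * m["acc_stderr"], m["acc"] + 1.96 * m["acc_stderr"]
    print(f"{task}: {m['acc']:.4f} (95% CI ~ [{lo:.4f}, {hi:.4f}])")
```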
39,010
[ [ -0.0279541015625, -0.05352783203125, 0.017822265625, 0.0191802978515625, -0.0140228271484375, 0.01194000244140625, -0.0246124267578125, -0.0198211669921875, 0.032257080078125, 0.03936767578125, -0.052276611328125, -0.0660400390625, -0.053497314453125, 0.0170...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o
2023-10-26T00:41:46.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T10:27:28
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T00:41:33.977337](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o/blob/main/results_2023-10-26T00-41-33.977337.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.33378775167785235,\n\ \ \"em_stderr\": 0.004829266317241522,\n \"f1\": 0.37629928691275216,\n\ \ \"f1_stderr\": 0.004755605249653425,\n \"acc\": 0.45257492790991716,\n\ \ \"acc_stderr\": 0.010688989200801685\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.33378775167785235,\n \"em_stderr\": 0.004829266317241522,\n\ \ \"f1\": 0.37629928691275216,\n \"f1_stderr\": 0.004755605249653425\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1379833206974981,\n \ \ \"acc_stderr\": 0.009499777327746827\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|arc:challenge|25_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T10-27-05.033674.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T00_41_33.977337 path: - '**/details_harness|drop|3_2023-10-26T00-41-33.977337.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T00-41-33.977337.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T00_41_33.977337 path: - '**/details_harness|gsm8k|5_2023-10-26T00-41-33.977337.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T00-41-33.977337.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hellaswag|10_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T10_27_05.033674 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-27-05.033674.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T10-27-05.033674.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T00_41_33.977337 path: - '**/details_harness|winogrande|5_2023-10-26T00-41-33.977337.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T00-41-33.977337.parquet' - config_name: results data_files: - split: 2023_10_10T10_27_05.033674 path: - results_2023-10-10T10-27-05.033674.parquet - split: 2023_10_26T00_41_33.977337 path: - results_2023-10-26T00-41-33.977337.parquet - split: latest path: - results_2023-10-26T00-41-33.977337.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T00:41:33.977337](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o/blob/main/results_2023-10-26T00-41-33.977337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.33378775167785235, "em_stderr": 0.004829266317241522, "f1": 0.37629928691275216, "f1_stderr": 0.004755605249653425, "acc": 0.45257492790991716, "acc_stderr": 0.010688989200801685 }, "harness|drop|3": { "em": 0.33378775167785235, "em_stderr": 0.004829266317241522, "f1": 0.37629928691275216, "f1_stderr": 0.004755605249653425 }, "harness|gsm8k|5": { "acc": 0.1379833206974981, "acc_stderr": 0.009499777327746827 }, "harness|winogrande|5": { "acc": 0.7671665351223362, "acc_stderr": 0.011878201073856544 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
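Since each run lives in a split named after its timestamp (pattern `YYYY_MM_DDTHH_MM_SS.ffffff`), the "latest" alias simply resolves to the lexicographically greatest timestamp split. A minimal sketch of resolving it by hand — the split names are the two real runs listed in the configs above, but the helper function is ours, not part of the `datasets` API:

```python
# Timestamp-style split names sort the same lexicographically as
# chronologically, so "latest" is just the maximum of the real splits.
def resolve_latest(split_names):
    """Return the timestamped split that the 'latest' alias points to."""
    return max(s for s in split_names if s != "latest")

splits = ["2023_10_10T10_27_05.033674", "2023_10_26T00_41_33.977337", "latest"]
print(resolve_latest(splits))  # -> 2023_10_26T00_41_33.977337
```

This only works because the zero-padded timestamp format makes string order and time order coincide.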
38,864
[ [ -0.0275726318359375, -0.053924560546875, 0.0167083740234375, 0.0200653076171875, -0.016845703125, 0.0128631591796875, -0.027099609375, -0.0210113525390625, 0.0333251953125, 0.03851318359375, -0.050811767578125, -0.06610107421875, -0.05224609375, 0.0174102783...
fmeleard/moody_data
2023-10-10T10:37:19.000Z
[ "task_categories:summarization", "task_categories:conversational", "language:fr", "license:apache-2.0", "region:us" ]
fmeleard
null
null
0
0
2023-10-10T10:34:34
--- license: apache-2.0 task_categories: - summarization - conversational language: - fr --- # Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. 
## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
4,453
[ [ -0.04034423828125, -0.0419921875, 0.00977325439453125, 0.0178070068359375, -0.0300445556640625, -0.0089263916015625, -0.0026721954345703125, -0.048431396484375, 0.043212890625, 0.059478759765625, -0.05938720703125, -0.069580078125, -0.042205810546875, 0.0099...
anhdungitvn/wiki_zh
2023-10-10T11:22:49.000Z
[ "region:us" ]
anhdungitvn
null
null
0
0
2023-10-10T11:17:05
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 2751775710 num_examples: 1375017 download_size: 1736768207 dataset_size: 2751775710 --- # Dataset Card for "wiki_zh" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
513
[ [ -0.048675537109375, -0.01467132568359375, 0.01812744140625, -0.00301361083984375, -0.024261474609375, -0.01629638671875, 0.0106353759765625, -0.0151519775390625, 0.064453125, 0.0279693603515625, -0.07122802734375, -0.056427001953125, -0.03076171875, -0.00815...
yangwang825/sst2-textbugger-5
2023-10-10T11:22:09.000Z
[ "region:us" ]
yangwang825
null
null
0
0
2023-10-10T11:19:25
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.057098388671875, 0.0288543701171875, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.0050506591796875, 0.0513916015625, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.060302734375, 0.03790...
open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k
2023-10-25T07:41:24.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T11:19:36
--- pretty_name: Evaluation run of mncai/Mistral-7B-OpenOrca-1k dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [mncai/Mistral-7B-OpenOrca-1k](https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T07:41:12.101153](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k/blob/main/results_2023-10-25T07-41-12.101153.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0053481543624161075,\n\ \ \"em_stderr\": 0.0007469252903319289,\n \"f1\": 0.09739828020134218,\n\ \ \"f1_stderr\": 0.001857285751420582,\n \"acc\": 0.45294831833688076,\n\ \ \"acc_stderr\": 0.01023434017882167\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0053481543624161075,\n \"em_stderr\": 0.0007469252903319289,\n\ \ \"f1\": 0.09739828020134218,\n \"f1_stderr\": 0.001857285751420582\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \ \ \"acc_stderr\": 0.008944213403553095\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090247\n\ \ }\n}\n```" repo_url: https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|arc:challenge|25_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T11-19-13.410150.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T07_41_12.101153 path: - '**/details_harness|drop|3_2023-10-25T07-41-12.101153.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T07-41-12.101153.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T07_41_12.101153 path: - '**/details_harness|gsm8k|5_2023-10-25T07-41-12.101153.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T07-41-12.101153.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hellaswag|10_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T11_19_13.410150 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-19-13.410150.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-19-13.410150.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T07_41_12.101153 path: - '**/details_harness|winogrande|5_2023-10-25T07-41-12.101153.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T07-41-12.101153.parquet' - config_name: results data_files: - split: 2023_10_10T11_19_13.410150 path: - results_2023-10-10T11-19-13.410150.parquet - split: 2023_10_25T07_41_12.101153 path: - results_2023-10-25T07-41-12.101153.parquet - split: latest path: - results_2023-10-25T07-41-12.101153.parquet --- # Dataset Card for Evaluation run of mncai/Mistral-7B-OpenOrca-1k ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [mncai/Mistral-7B-OpenOrca-1k](https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T07:41:12.101153](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k/blob/main/results_2023-10-25T07-41-12.101153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0053481543624161075, "em_stderr": 0.0007469252903319289, "f1": 0.09739828020134218, "f1_stderr": 0.001857285751420582, "acc": 0.45294831833688076, "acc_stderr": 0.01023434017882167 }, "harness|drop|3": { "em": 0.0053481543624161075, "em_stderr": 0.0007469252903319289, "f1": 0.09739828020134218, "f1_stderr": 0.001857285751420582 }, "harness|gsm8k|5": { "acc": 0.1197877179681577, "acc_stderr": 0.008944213403553095 }, "harness|winogrande|5": { "acc": 0.7861089187056038, "acc_stderr": 0.011524466954090247 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,658
[ [ -0.034393310546875, -0.047576904296875, 0.00771331787109375, 0.018524169921875, -0.011260986328125, -0.0007300376892089844, -0.02557373046875, -0.01324462890625, 0.034393310546875, 0.042816162109375, -0.049896240234375, -0.07427978515625, -0.045623779296875, ...
yangwang825/sst2-pwws-5
2023-10-10T11:27:22.000Z
[ "region:us" ]
yangwang825
null
null
0
0
2023-10-10T11:22:21
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.057098388671875, 0.0288543701171875, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.0050506591796875, 0.0513916015625, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.060302734375, 0.03790...
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down
2023-10-28T04:49:35.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T11:25:25
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T04:49:22.682759](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down/blob/main/results_2023-10-28T04-49-22.682759.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07487416107382551,\n\ \ \"em_stderr\": 0.0026952933607895307,\n \"f1\": 0.12380872483221442,\n\ \ \"f1_stderr\": 0.00285148758396334,\n \"acc\": 0.44832731261215925,\n\ \ \"acc_stderr\": 0.010657041987495935\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.07487416107382551,\n \"em_stderr\": 0.0026952933607895307,\n\ \ \"f1\": 0.12380872483221442,\n \"f1_stderr\": 0.00285148758396334\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.133434420015163,\n \ \ \"acc_stderr\": 0.00936649160978448\n },\n \"harness|winogrande|5\": {\n\ \ \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.01194759236520739\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|arc:challenge|25_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T11-25-01.199069.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T04_49_22.682759 path: - '**/details_harness|drop|3_2023-10-28T04-49-22.682759.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T04-49-22.682759.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T04_49_22.682759 path: - '**/details_harness|gsm8k|5_2023-10-28T04-49-22.682759.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T04-49-22.682759.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hellaswag|10_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T11_25_01.199069 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-25-01.199069.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-25-01.199069.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T04_49_22.682759 path: - '**/details_harness|winogrande|5_2023-10-28T04-49-22.682759.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T04-49-22.682759.parquet' - config_name: results data_files: - split: 2023_10_10T11_25_01.199069 path: - results_2023-10-10T11-25-01.199069.parquet - split: 2023_10_28T04_49_22.682759 path: - results_2023-10-28T04-49-22.682759.parquet - split: latest path: - results_2023-10-28T04-49-22.682759.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down ## 
Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T04:49:22.682759](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down/blob/main/results_2023-10-28T04-49-22.682759.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.07487416107382551, "em_stderr": 0.0026952933607895307, "f1": 0.12380872483221442, "f1_stderr": 0.00285148758396334, "acc": 0.44832731261215925, "acc_stderr": 0.010657041987495935 }, "harness|drop|3": { "em": 0.07487416107382551, "em_stderr": 0.0026952933607895307, "f1": 0.12380872483221442, "f1_stderr": 0.00285148758396334 }, "harness|gsm8k|5": { "acc": 0.133434420015163, "acc_stderr": 0.00936649160978448 }, "harness|winogrande|5": { "acc": 0.7632202052091555, "acc_stderr": 0.01194759236520739 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
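The aggregated metrics in the "Latest results" block above are plain nested JSON, so once parsed they can be inspected with ordinary Python. A minimal sketch, offline on purpose: the dict literal copies the per-task values verbatim from the card, and `best_task` is a helper introduced here for illustration (it is not part of the leaderboard tooling).

```python
# Per-task metrics copied from the "Latest results" section of the card above.
results = {
    "harness|drop|3": {"em": 0.07487416107382551, "f1": 0.12380872483221442},
    "harness|gsm8k|5": {"acc": 0.133434420015163},
    "harness|winogrande|5": {"acc": 0.7632202052091555},
}

def best_task(metrics: dict, key: str) -> str:
    """Return the task name with the highest value for the given metric key,
    skipping tasks that do not report that metric."""
    scored = {task: vals[key] for task, vals in metrics.items() if key in vals}
    return max(scored, key=scored.get)

print(best_task(results, "acc"))  # -> harness|winogrande|5
```

The same pattern applies to any card in this collection, since every run stores its aggregated metrics under the same task-keyed layout.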
38,907
[ [ -0.028045654296875, -0.053009033203125, 0.018157958984375, 0.019927978515625, -0.0135040283203125, 0.012115478515625, -0.025115966796875, -0.0195465087890625, 0.03192138671875, 0.038818359375, -0.05230712890625, -0.06695556640625, -0.0543212890625, 0.0175781...
open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k
2023-10-27T04:31:58.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T11:26:59
--- pretty_name: Evaluation run of mncai/Mistral-7B-openplatypus-1k dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [mncai/Mistral-7B-openplatypus-1k](https://huggingface.co/mncai/Mistral-7B-openplatypus-1k)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T04:31:44.728538](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k/blob/main/results_2023-10-27T04-31-44.728538.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\ \ \"em_stderr\": 0.00045666764626669425,\n \"f1\": 0.06536912751677865,\n\ \ \"f1_stderr\": 0.001427220169024926,\n \"acc\": 0.47155979662189373,\n\ \ \"acc_stderr\": 0.01115073074341337\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669425,\n\ \ \"f1\": 0.06536912751677865,\n \"f1_stderr\": 0.001427220169024926\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17437452615617893,\n \ \ \"acc_stderr\": 0.010451421361976233\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n\ \ }\n}\n```" repo_url: https://huggingface.co/mncai/Mistral-7B-openplatypus-1k leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|arc:challenge|25_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T11-26-36.133476.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T04_31_44.728538 path: - '**/details_harness|drop|3_2023-10-27T04-31-44.728538.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T04-31-44.728538.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T04_31_44.728538 path: - '**/details_harness|gsm8k|5_2023-10-27T04-31-44.728538.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T04-31-44.728538.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hellaswag|10_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T11_26_36.133476 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-26-36.133476.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-26-36.133476.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T04_31_44.728538 path: - '**/details_harness|winogrande|5_2023-10-27T04-31-44.728538.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T04-31-44.728538.parquet' - config_name: results data_files: - split: 2023_10_10T11_26_36.133476 path: - results_2023-10-10T11-26-36.133476.parquet - split: 2023_10_27T04_31_44.728538 path: - results_2023-10-27T04-31-44.728538.parquet - split: latest path: - results_2023-10-27T04-31-44.728538.parquet --- # Dataset Card for Evaluation run of mncai/Mistral-7B-openplatypus-1k ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/mncai/Mistral-7B-openplatypus-1k - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [mncai/Mistral-7B-openplatypus-1k](https://huggingface.co/mncai/Mistral-7B-openplatypus-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T04:31:44.728538](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k/blob/main/results_2023-10-27T04-31-44.728538.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0019924496644295304, "em_stderr": 0.00045666764626669425, "f1": 0.06536912751677865, "f1_stderr": 0.001427220169024926, "acc": 0.47155979662189373, "acc_stderr": 0.01115073074341337 }, "harness|drop|3": { "em": 0.0019924496644295304, "em_stderr": 0.00045666764626669425, "f1": 0.06536912751677865, "f1_stderr": 0.001427220169024926 }, "harness|gsm8k|5": { "acc": 0.17437452615617893, "acc_stderr": 0.010451421361976233 }, "harness|winogrande|5": { "acc": 0.7687450670876085, "acc_stderr": 0.01185004012485051 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
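For quick offline inspection, the aggregated metrics in the latest-results JSON earlier in this card can also be parsed with the Python standard library alone, without downloading the dataset. A minimal sketch: the JSON literal below is copied from the 2023-10-27 run shown above (stderr fields omitted for brevity), and the `accuracies` dict is purely illustrative, not part of the card's schema:

```python
import json

# Aggregated metrics copied from the latest-results JSON above
# (run 2023-10-27T04:31:44.728538); stderr fields omitted for brevity.
results_json = """
{
    "all": {
        "em": 0.0019924496644295304,
        "f1": 0.06536912751677865,
        "acc": 0.47155979662189373
    },
    "harness|drop|3": {"em": 0.0019924496644295304, "f1": 0.06536912751677865},
    "harness|gsm8k|5": {"acc": 0.17437452615617893},
    "harness|winogrande|5": {"acc": 0.7687450670876085}
}
"""

results = json.loads(results_json)

# Collect per-task accuracy for every harness task that reports one,
# skipping the "all" aggregate.
accuracies = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(accuracies)
```

The same pattern works on any `results_*.json` file linked from the "Latest results" section.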
38,710
[ [ -0.0325927734375, -0.04364013671875, 0.00853729248046875, 0.020233154296875, -0.01406097412109375, -0.0016765594482421875, -0.0264129638671875, -0.009429931640625, 0.034637451171875, 0.04248046875, -0.05450439453125, -0.07025146484375, -0.045928955078125, 0....
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down
2023-10-28T17:20:30.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T11:32:28
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T17:20:18.070512](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-28T17-20-18.070512.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3329488255033557,\n\ \ \"em_stderr\": 0.0048262295177582465,\n \"f1\": 0.37401635906040315,\n\ \ \"f1_stderr\": 0.004743003734543155,\n \"acc\": 0.444993426772692,\n\ \ \"acc_stderr\": 0.010459654838365608\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.3329488255033557,\n \"em_stderr\": 0.0048262295177582465,\n\ \ \"f1\": 0.37401635906040315,\n \"f1_stderr\": 0.004743003734543155\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \ \ \"acc_stderr\": 0.009041108602874675\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856542\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|arc:challenge|25_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T11-32-04.979499.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T17_20_18.070512 path: - '**/details_harness|drop|3_2023-10-28T17-20-18.070512.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T17-20-18.070512.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T17_20_18.070512 path: - '**/details_harness|gsm8k|5_2023-10-28T17-20-18.070512.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T17-20-18.070512.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hellaswag|10_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T11_32_04.979499 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-32-04.979499.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-32-04.979499.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T17_20_18.070512 path: - '**/details_harness|winogrande|5_2023-10-28T17-20-18.070512.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T17-20-18.070512.parquet' - config_name: results data_files: - split: 2023_10_10T11_32_04.979499 path: - results_2023-10-10T11-32-04.979499.parquet - split: 2023_10_28T17_20_18.070512 path: - results_2023-10-28T17-20-18.070512.parquet - split: latest path: - results_2023-10-28T17-20-18.070512.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down 
## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T17:20:18.070512](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-28T17-20-18.070512.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.3329488255033557, "em_stderr": 0.0048262295177582465, "f1": 0.37401635906040315, "f1_stderr": 0.004743003734543155, "acc": 0.444993426772692, "acc_stderr": 0.010459654838365608 }, "harness|drop|3": { "em": 0.3329488255033557, "em_stderr": 0.0048262295177582465, "f1": 0.37401635906040315, "f1_stderr": 0.004743003734543155 }, "harness|gsm8k|5": { "acc": 0.12282031842304776, "acc_stderr": 0.009041108602874675 }, "harness|winogrande|5": { "acc": 0.7671665351223362, "acc_stderr": 0.011878201073856542 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
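The "results" configuration described in the summary backs the aggregated block just shown; its structure is one top-level key per task, plus an "all" entry averaging across tasks. As a minimal, self-contained sketch, the figures can be read with the standard `json` module (this parses an abridged copy of the numbers quoted above rather than downloading anything from the Hub):

```python
import json

# Abridged copy of the "Latest results" metrics shown above.
results_json = """
{
    "all": {
        "em": 0.3329488255033557,
        "f1": 0.37401635906040315,
        "acc": 0.444993426772692
    },
    "harness|gsm8k|5": {
        "acc": 0.12282031842304776
    },
    "harness|winogrande|5": {
        "acc": 0.7671665351223362
    }
}
"""

results = json.loads(results_json)

# "all" holds metrics averaged across tasks; the other keys are per-task.
for task, metrics in results.items():
    for name, value in metrics.items():
        print(f"{task}/{name} = {value:.4f}")
```

When working with the live dataset, the same dictionary shape comes back from the "results" configuration via `load_dataset`, as in the snippet earlier in the summary.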
39,006
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down
2023-10-26T15:53:22.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T11:38:47
--- pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T15:53:08.381645](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down/blob/main/results_2023-10-26T15-53-08.381645.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2154991610738255,\n\ \ \"em_stderr\": 0.004210747014430766,\n \"f1\": 0.25919148489932897,\n\ \ \"f1_stderr\": 0.004195696877017449,\n \"acc\": 0.4490387889225113,\n\ \ \"acc_stderr\": 0.01073317504472215\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2154991610738255,\n \"em_stderr\": 0.004210747014430766,\n\ \ \"f1\": 0.25919148489932897,\n \"f1_stderr\": 0.004195696877017449\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1372251705837756,\n \ \ \"acc_stderr\": 0.009477808244600398\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843905\n\ \ }\n}\n```" repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|arc:challenge|25_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T11-38-23.134636.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T15_53_08.381645 path: - '**/details_harness|drop|3_2023-10-26T15-53-08.381645.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T15-53-08.381645.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T15_53_08.381645 path: - '**/details_harness|gsm8k|5_2023-10-26T15-53-08.381645.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T15-53-08.381645.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hellaswag|10_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T11_38_23.134636 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-38-23.134636.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T11-38-23.134636.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T15_53_08.381645 path: - '**/details_harness|winogrande|5_2023-10-26T15-53-08.381645.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T15-53-08.381645.parquet' - config_name: results data_files: - split: 2023_10_10T11_38_23.134636 path: - results_2023-10-10T11-38-23.134636.parquet - split: 2023_10_26T15_53_08.381645 path: - results_2023-10-26T15-53-08.381645.parquet - split: latest path: - results_2023-10-26T15-53-08.381645.parquet --- # Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down ## 
Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T15:53:08.381645](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down/blob/main/results_2023-10-26T15-53-08.381645.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2154991610738255, "em_stderr": 0.004210747014430766, "f1": 0.25919148489932897, "f1_stderr": 0.004195696877017449, "acc": 0.4490387889225113, "acc_stderr": 0.01073317504472215 }, "harness|drop|3": { "em": 0.2154991610738255, "em_stderr": 0.004210747014430766, "f1": 0.25919148489932897, "f1_stderr": 0.004195696877017449 }, "harness|gsm8k|5": { "acc": 0.1372251705837756, "acc_stderr": 0.009477808244600398 }, "harness|winogrande|5": { "acc": 0.760852407261247, "acc_stderr": 0.011988541844843905 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
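The aggregated "latest results" shown in the card above are plain nested JSON, so once the results file is loaded, individual metrics can be pulled out with ordinary dict access. A minimal sketch using the numbers copied verbatim from the results above (no network access; only the accuracy-reporting entries are shown):

```python
# A minimal sketch: the aggregated results are a plain nested dict, so
# per-task accuracies can be collected with ordinary dict access.
# Values are copied verbatim from the latest results shown above.
latest = {
    "all": {"em": 0.2154991610738255, "f1": 0.25919148489932897, "acc": 0.4490387889225113},
    "harness|gsm8k|5": {"acc": 0.1372251705837756},
    "harness|winogrande|5": {"acc": 0.760852407261247},
}

# Collect every entry that reports an accuracy metric.
accs = {task: metrics["acc"] for task, metrics in latest.items() if "acc" in metrics}
best_task = max(accs, key=accs.get)
print(best_task, round(accs[best_task], 4))  # → harness|winogrande|5 0.7609
```

The same pattern applies to any results JSON produced by these evaluation runs, since each task key maps to a flat dict of metric names and values.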
38,902
[ [ -0.0282135009765625, -0.053375244140625, 0.0181121826171875, 0.0190887451171875, -0.01378631591796875, 0.0119476318359375, -0.025146484375, -0.0201263427734375, 0.032623291015625, 0.03900146484375, -0.053070068359375, -0.0675048828125, -0.053741455078125, 0....
polinaeterna/OpenOrca
2023-10-10T11:45:53.000Z
[ "task_categories:conversational", "task_categories:text-classification", "task_categories:token-classification", "task_categories:table-question-answering", "task_categories:question-answering", "task_categories:zero-shot-classification", "task_categories:summarization", "task_categories:feature-extra...
polinaeterna
null
null
0
0
2023-10-10T11:45:53
--- language: - en license: mit task_categories: - conversational - text-classification - token-classification - table-question-answering - question-answering - zero-shot-classification - summarization - feature-extraction - text-generation - text2text-generation pretty_name: OpenOrca size_categories: - 10M<n<100M --- ## Table of Contents - [Dataset Summary](#dataset-summary) - [Dataset Attribution](#dataset-attribution) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Dataset Use](#dataset-use) - [Use Cases](#use-cases) - [Usage Caveats](#usage-caveats) - [Getting Started](#getting-started) <p><h1>🐋 The OpenOrca Dataset! 🐋</h1></p> ![OpenOrca Logo](https://huggingface.co/datasets/Open-Orca/OpenOrca/resolve/main/OpenOrcaLogo.png "OpenOrca Logo") <a name="dataset-announcement"></a> We are thrilled to announce the release of the OpenOrca dataset! This rich collection of augmented FLAN data aligns, as best as possible, with the distributions outlined in the [Orca paper](https://arxiv.org/abs/2306.02707). It has been instrumental in generating high-performing model checkpoints and serves as a valuable resource for all NLP researchers and developers! # Official Models ## Mistral-7B-OpenOrca Our [latest model](https://huggingface.co/spaces/Open-Orca/Mistral-7B-OpenOrca), the first 7B to score better overall than all previous models below 30B. 98% of Llama2-70b-chat's performance, in a completely open 7B! ## OpenOrca-Platypus2-13B Our [third model](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B), the first 13B model to score higher than LLaMA1-65B on the HuggingFace Leaderboard! Released in partnership with Platypus. 
## LlongOrca 7B & 13B * Our [first 7B release](https://huggingface.co/Open-Orca/LlongOrca-7B-16k), trained on top of LLongMA2 to achieve 16,000 tokens context. #1 long context 7B model at release time, with >99% of the overall #1 model's performance. * [LlongOrca-13B-16k](https://huggingface.co/Open-Orca/LlongOrca-13B-16k), trained on top of LLongMA2. #1 long context 13B model at release time, with >97% of the overall #1 model's performance. ## OpenOrcaxOpenChat-Preview2-13B Our [second model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B), highlighting that we've surpassed the performance reported in the Orca paper. Was #1 at release time, now surpassed by our own OpenOrca-Platypus2-13B. Released in partnership with OpenChat. ## OpenOrca-Preview1-13B [OpenOrca-Preview1-13B](https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B) This model was trained in less than a day, for <$200, with <10% of our data. At release, it beat the current state of the art models on BigBench-Hard and AGIEval. Achieves ~60% of the improvements reported in the Orca paper. <a name="dataset-summary"></a> # Dataset Summary The OpenOrca dataset is a collection of augmented [FLAN Collection data](https://arxiv.org/abs/2301.13688). Currently ~1M GPT-4 completions, and ~3.2M GPT-3.5 completions. It is tabularized in alignment with the distributions presented in the ORCA paper and currently represents a partial completion of the full intended dataset, with ongoing generation to expand its scope. The data is primarily used for training and evaluation in the field of natural language processing. 
<a name="dataset-attribution"></a> # Dataset Attribution We would like to give special recognition to the following contributors for their significant efforts and dedication: Teknium WingLian/Caseus Eric Hartford NanoBit Pankaj Winddude Rohan http://AlignmentLab.ai: Autometa Entropi AtlasUnified NeverendingToast NanoBit WingLian/Caseus Also of course, as always, TheBloke, for being the backbone of the whole community. Many thanks to NanoBit and Caseus, makers of [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl), for lending us their expertise on the platform that developed and trained manticore, minotaur, and many others! We are welcoming sponsors or collaborators to help us build these models to the scale they deserve. Please reach out via our socials: http://Alignmentlab.ai https://discord.gg/n9hXaBPWxx Want to visualize our full dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2). [<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2) <a name="supported-tasks-and-leaderboards"></a> # Supported Tasks and Leaderboards This dataset supports a range of tasks including language modeling, text generation, and text augmentation. It has been instrumental in the generation of multiple high-performing model checkpoints which have exhibited exceptional performance in our unit testing. Further information on leaderboards will be updated as they become available. <a name="languages"></a> # Languages The language of the data is primarily English. 
<a name="dataset-structure"></a> # Dataset Structure <a name="data-instances"></a> ## Data Instances A data instance in this dataset represents entries from the FLAN collection which have been augmented by submitting the listed question to either GPT-4 or GPT-3.5. The response is then entered into the response field. <a name="data-fields"></a> ## Data Fields The fields are: 1) 'id', a unique numbered identifier which includes one of 'niv', 't0', 'cot', or 'flan' to represent which source FLAN Collection submix the 'question' is sourced from. 2) 'system_prompt', representing the System Prompt presented to the GPT-3.5 or GPT-4 API for the datapoint 3) 'question', representing a question entry as provided by the FLAN Collection 4) 'response', a response to that question received from a query to either GPT-3.5 or GPT-4. <a name="data-splits"></a> ## Data Splits The data is unsplit. <a name="dataset-creation"></a> # Dataset Creation <a name="curation-rationale"></a> ## Curation Rationale The dataset was created to provide a source of augmented text data for researchers and developers. The datapoints are intended primarily to provide an enhancement of the core FLAN Collection data which relies upon the detailed step by step reasoning capabilities of GPT-3.5 and GPT-4. This "reasoning trace" augmentation has demonstrated exceptional results, allowing a LLaMA-13B model trained with this data to rival or beat GPT-3.5 on broad sets of hard reasoning tasks which all models below 100B parameters had previously performed dramatically worse on. <a name="source-data"></a> ## Source Data The data is generated using techniques in alignment with the distributions outlined in the Orca paper, except as noted below: 1) There is not enough CoT data in the FLAN Collection to generate 150K zero-shot entries, as the paper purports to use. We suspect this portion was either undocumented or misrepresented. We have used the ~75K points available. 
2) We used the pre-generated FLAN Collection datasets hosted on HuggingFace under conceptofmind, e.g. [conceptofmind/flan2021](https://huggingface.co/datasets/conceptofmind/flan2021_submix_original). These are referenced by the [official FLAN Collection repo](https://github.com/google-research/FLAN/tree/main/flan/v2) as the preferred data source. However, these are a subset of the full FLAN Collection data, and have fewer than the required entries for the flan2021 and t0 submixes, by ~1.25M and 200k respectively. Combined, this gave us ~1.5M fewer datapoints than in the original Orca paper. Completing the set is ongoing work. <a name="dataset-use"></a> # Dataset Use <a name="use-cases"></a> ## Use Cases The dataset can be used for tasks related to language understanding, natural language processing, machine learning model training, and model performance evaluation. <a name="usage-caveats"></a> ## Usage Caveats Given that this is a work-in-progress dataset, it is recommended to regularly check for updates and improvements. Further, the data should be used in accordance with the guidelines and recommendations outlined in the Orca paper. <a name="getting-started"></a> ## Getting Started This dataset is organized such that it can be naively loaded via the Hugging Face datasets library. We recommend using streaming due to the large size of the files. Regular updates and data generation progress can be monitored through the OpenOrca repository on Hugging Face. 
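As a concrete sketch of how the four documented fields fit together at training time, the snippet below assembles a chat-style example from a hypothetical record. The record contents are illustrative only, not actual dataset rows; in practice rows would come from `load_dataset("Open-Orca/OpenOrca", split="train", streaming=True)` as recommended above.

```python
# Hypothetical OpenOrca-style record with the four documented fields;
# the values are illustrative, not taken from the actual dataset.
record = {
    "id": "flan.123456",  # prefix encodes the FLAN submix ('niv', 't0', 'cot', or 'flan')
    "system_prompt": "You are a helpful assistant that explains its reasoning step by step.",
    "question": "A train travels 60 miles in 1.5 hours. What is its average speed?",
    "response": "Average speed = distance / time = 60 / 1.5 = 40 miles per hour.",
}

def to_messages(rec: dict) -> list:
    """Convert one record into the chat-message layout most trainers expect."""
    messages = []
    if rec.get("system_prompt"):
        messages.append({"role": "system", "content": rec["system_prompt"]})
    messages.append({"role": "user", "content": rec["question"]})
    messages.append({"role": "assistant", "content": rec["response"]})
    return messages

messages = to_messages(record)
print([m["role"] for m in messages])  # → ['system', 'user', 'assistant']
```

Records with an empty `system_prompt` simply produce a two-message user/assistant pair.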
# Citation ```bibtex @misc{OpenOrca, title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces}, author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"}, year = {2023}, publisher = {HuggingFace}, journal = {HuggingFace repository}, howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}}, } ``` ```bibtex @misc{mukherjee2023orca, title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4}, author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah}, year={2023}, eprint={2306.02707}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ```bibtex @misc{longpre2023flan, title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning}, author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts}, year={2023}, eprint={2301.13688}, archivePrefix={arXiv}, primaryClass={cs.AI} } ``` ```bibtex @misc{touvron2023llama2, title={Llama 2: Open Foundation and Fine-Tuned Chat Models}, author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy 
Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom}, year={2023}, eprint={2307.09288}, archivePrefix={arXiv}, primaryClass={cs.CL} } @software{touvron2023llama, title={LLaMA: Open and Efficient Foundation Language Models}, author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume}, journal={arXiv preprint arXiv:2302.13971}, year={2023} } ```
11,959
[ [ -0.0445556640625, -0.05316162109375, 0.01392364501953125, -0.0010623931884765625, -0.002964019775390625, -0.01245880126953125, -0.01313018798828125, -0.06402587890625, 0.03668212890625, 0.036041259765625, -0.034088134765625, -0.051300048828125, -0.02914428710937...
ordererecprime/ErecPrime
2023-10-10T11:50:40.000Z
[ "region:us" ]
ordererecprime
null
null
0
0
2023-10-10T11:47:21
Have you also tried out a plethora of products on the market that claim to boost male health but only ever got disappointed? If yes, ErecPrime is the last product that you’ll be required to **[try out now](https://snoppymart.com/erecprime/)**. The advanced and natural formula of the magical [ErecPrime](https://snoppymart.com/erecprime/) is created to support virility and boost libido in men. With its all-natural composition, the tonic has been proven to improve the overall health and well-being of customers. Every ingredient used in this product has been well-researched and only then selected to ensure safe and effective usage of the ErecPrime. [![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEijO8y3_7DsM2OGdhXKnlLNm9bE5mWcbDV3SdfPLkJB9krwSm4wxoj06dqQtSx7fdft0AdH872Ef6CXWFuven0EdTwK-5CQg0Bed9CJ0k7c9wH69Q4PK22Md6PUVQRSRNz7LNCeexCgVdWzkLU3b6aqjMrdKZNFUK5lD_asT-OsZEMV12XlNpQ6Oq_rEN0/w640-h500/Screenshot%20(1389).png)](https://snoppymart.com/erecprime/) As their website claims, over 88,730 users have tried and loved this product! This boosts the faith of the maker in their product and supports their claim of being one of the most effective and purest male health solutions. There’s a lot more that you need to know about the ErecPrime and that’s exactly what we are here for today. We’ll tell you how the tonic works, what are its benefits, how it is priced, and much more! You must read this detailed review till the end to make sure you don’t miss out on any crucial information. But first, let’s start with a quick summary of ErecPrime: **Product Category:** Health Supplement **Product Name:** ErecPrime **Health Focus:** Male Health **Product Form:** Capsules **Side Effects:** Currently, studies or ErecPrime Reviews have reported no side effects of using the supplement. (Check out the reviews!) 
**Key Features:** * Manufactured in the FDA-registered facility in the US * Made with natural ingredients * Ease of Use * GMO-free * Stimulant-free * No Side Effects **Benefits:** * Increases libido * Promotes healthy prostate * Boosts energy levels **Pricing:** A single bottle of ErecPrime costs $69. **Money-Back Guarantee:** Applicable for 60 days **ErecPrime Reviews:** ErecPrime Reviews are normally positive. **Where to Buy?** You can purchase the ErecPrime only from its official website. ### Buy Link >> [https://snoppymart.com/erecprime/](https://snoppymart.com/erecprime/) How Does The ErecPrime Work? ---------------------------- All you have to do is take a capsule of ErecPrime every day with plentiful water. With its extraordinary formulation of mineral and plant-based extracts, the tonic will then work to target a particular enzyme, called the ‘erection enzyme’ that is responsible for your sexual performance. This enzyme relaxes all the muscles in the penis and increases the production of nitric oxide in your body. This further improves the blood circulation to your penis and results in hard and long erections. The product gradually prepares your penis to start getting erections naturally which once seemed to be a thing of the past. [![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEijO8y3_7DsM2OGdhXKnlLNm9bE5mWcbDV3SdfPLkJB9krwSm4wxoj06dqQtSx7fdft0AdH872Ef6CXWFuven0EdTwK-5CQg0Bed9CJ0k7c9wH69Q4PK22Md6PUVQRSRNz7LNCeexCgVdWzkLU3b6aqjMrdKZNFUK5lD_asT-OsZEMV12XlNpQ6Oq_rEN0/w640-h500/Screenshot%20(1389).png)](https://snoppymart.com/erecprime/) What Are The Natural Ingredients That Go Into The Making Of ErecPrime? ---------------------------------------------------------------------- Let’s now take a look at the ingredients present in ErecPrime that make it as effective as it is for promoting male virility: ### Rehmanniae Radix At a molecular level, Rehmanniae Radix contains bioactive compounds such as iridoid glycosides, catalpol, and aucubin. 
These compounds exert their effects by interacting with various physiological processes in the body. One key mechanism by which Rehmanniae Radix supports workout performance is its modulation of the hypothalamic-pituitary-adrenal (HPA) axis. The HPA axis plays a crucial role in the body’s response to stress and exercise. During intense physical activity, the HPA axis is activated, leading to the release of cortisol, a hormone that helps regulate energy metabolism and response to stress. However, excessive cortisol release can have detrimental effects on workout performance and energy levels. Rehmanniae Radix has been found to regulate the HPA axis and normalize cortisol levels. By doing so, it helps prevent the negative impact of excessive cortisol release during intense exercise. This regulation of cortisol levels contributes to improved workout performance and enhanced energy levels in men. [Get started with the ErecPrime today!](https://snoppymart.com/erecprime/)
4,870
[ [ -0.00806427001953125, -0.0516357421875, 0.036834716796875, 0.0206756591796875, -0.003543853759765625, 0.011688232421875, -0.0035839080810546875, -0.0645751953125, 0.060791015625, 0.0159454345703125, -0.045135498046875, -0.034576416015625, -0.0018243789672851562,...
Ramzey/processed_bert_dataset
2023-10-10T13:02:03.000Z
[ "region:us" ]
Ramzey
null
null
0
0
2023-10-10T11:56:07
--- dataset_info: features: - name: input_ids sequence: int32 - name: token_type_ids sequence: int8 - name: attention_mask sequence: int8 - name: special_tokens_mask sequence: int8 splits: - name: train num_bytes: 576000.0 num_examples: 160 download_size: 0 dataset_size: 576000.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "processed_bert_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
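For orientation, the four features listed above are the standard per-token outputs of a BERT tokenizer. The sketch below uses a toy row with made-up values (not taken from this dataset; ids 101/102 are [CLS]/[SEP] in the bert-base-uncased vocabulary) to show how the sequences relate:

```python
# Toy row mirroring the documented features; values are illustrative only.
row = {
    "input_ids":           [101, 7592, 2088, 102, 0, 0],  # token ids: [CLS] ... [SEP] + padding
    "token_type_ids":      [0, 0, 0, 0, 0, 0],            # segment ids (all 0 for single-segment input)
    "attention_mask":      [1, 1, 1, 1, 0, 0],            # 1 = real token, 0 = padding
    "special_tokens_mask": [1, 0, 0, 1, 1, 1],            # 1 = special/padding token, 0 = ordinary token
}

# All four sequences are aligned per token, so they must share one length.
assert len({len(v) for v in row.values()}) == 1

# Number of non-padding tokens the model will actually attend to.
real_tokens = sum(row["attention_mask"])
print(real_tokens)  # → 4
```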
589
[ [ -0.043609619140625, -0.0272369384765625, 0.0171661376953125, 0.025054931640625, -0.01641845703125, -0.005863189697265625, 0.005680084228515625, -0.023345947265625, 0.060302734375, 0.0360107421875, -0.0716552734375, -0.045257568359375, -0.035186767578125, -0....
mesude/turkishReviews-mini
2023-10-10T12:02:30.000Z
[ "region:us" ]
mesude
null
null
0
0
2023-10-10T12:02:30
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking
2023-10-10T12:12:04.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T12:11:05
--- pretty_name: Evaluation run of xDAN-AI/xDAN-L1-Thinking dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [xDAN-AI/xDAN-L1-Thinking](https://huggingface.co/xDAN-AI/xDAN-L1-Thinking) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T12:10:41.690417](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking/blob/main/results_2023-10-10T12-10-41.690417.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6289386722445444,\n\ \ \"acc_stderr\": 0.033198865884709695,\n \"acc_norm\": 0.6328416238634791,\n\ \ \"acc_norm_stderr\": 0.03317472328982102,\n \"mc1\": 0.3659730722154223,\n\ \ \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5212953226899916,\n\ \ \"mc2_stderr\": 0.015376349056492798\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229321,\n\ \ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955012\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n\ \ \"acc_stderr\": 0.0047349726682996175,\n \"acc_norm\": 0.8453495319657439,\n\ \ \"acc_norm_stderr\": 0.0036083220651418903\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\ \ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\ \ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\ \ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\ \ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \ \ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n\ \ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\ \ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\ \ \"acc_norm_stderr\": 0.03827052357950756\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\ \ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\ \ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\ \ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\ \ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\ acc_norm\": 0.4021164021164021,\n 
\"acc_norm_stderr\": 0.025253032554997692\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\ \ \"acc_stderr\": 0.024362599693031093,\n \"acc_norm\": 0.7580645161290323,\n\ \ \"acc_norm_stderr\": 0.024362599693031093\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\ \ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\ acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\ \ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \ \ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \ \ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \ \ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431378,\n \"\ acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431378\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\ acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \ \ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\ \ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\ \ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\ \ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\ \ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\ \ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\ \ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\ \ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\ \ \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n\ \ 
\"acc_norm_stderr\": 0.015414494487903219\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\ \ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\ \ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\ \ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \ \ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\ \ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\ \ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\ \ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \ \ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \ \ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\ \ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\ \ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\ \ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\ \ \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5212953226899916,\n\ \ \"mc2_stderr\": 0.015376349056492798\n }\n}\n```" repo_url: https://huggingface.co/xDAN-AI/xDAN-L1-Thinking leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|arc:challenge|25_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hellaswag|10_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T12_10_41.690417 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T12-10-41.690417.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T12-10-41.690417.parquet' - config_name: results data_files: - split: 2023_10_10T12_10_41.690417 path: - results_2023-10-10T12-10-41.690417.parquet - split: latest path: - results_2023-10-10T12-10-41.690417.parquet --- # Dataset Card for Evaluation run of xDAN-AI/xDAN-L1-Thinking ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/xDAN-AI/xDAN-L1-Thinking - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[xDAN-AI/xDAN-L1-Thinking](https://huggingface.co/xDAN-AI/xDAN-L1-Thinking)
on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the
evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split
in each configuration, the split being named using the timestamp of the run. The
"train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run
(and is used to compute and display the aggregated metrics on the
[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-10T12:10:41.690417](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking/blob/main/results_2023-10-10T12-10-41.690417.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6289386722445444,
        "acc_stderr": 0.033198865884709695,
        "acc_norm": 0.6328416238634791,
        "acc_norm_stderr": 0.03317472328982102,
        "mc1": 0.3659730722154223,
        "mc1_stderr": 0.01686294168408838,
        "mc2": 0.5212953226899916,
        "mc2_stderr": 0.015376349056492798
    },
    "harness|arc:challenge|25": {
        "acc": 0.5947098976109215,
        "acc_stderr": 0.014346869060229321,
        "acc_norm": 0.6373720136518771,
        "acc_norm_stderr": 0.014049106564955012
    },
    "harness|hellaswag|10": {
        "acc": 0.6577375024895439,
        "acc_stderr": 0.0047349726682996175,
        "acc_norm": 0.8453495319657439,
        "acc_norm_stderr": 0.0036083220651418903
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.34,
        "acc_stderr": 0.047609522856952365,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.047609522856952365
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.562962962962963,
        "acc_stderr": 0.04284958639753401,
        "acc_norm": 0.562962962962963,
        "acc_norm_stderr": 0.04284958639753401
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.6644736842105263,
        "acc_stderr": 0.038424985593952694,
        "acc_norm": 0.6644736842105263,
        "acc_norm_stderr": 0.038424985593952694
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.61,
        "acc_stderr": 0.04902071300001975,
        "acc_norm": 0.61,
        "acc_norm_stderr": 0.04902071300001975
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6679245283018868,
        "acc_stderr": 0.028985455652334388,
        "acc_norm": 0.6679245283018868,
        "acc_norm_stderr": 0.028985455652334388
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.7013888888888888,
        "acc_stderr": 0.03827052357950756,
        "acc_norm": 0.7013888888888888,
        "acc_norm_stderr": 0.03827052357950756
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.49,
        "acc_stderr": 0.05024183937956912,
        "acc_norm": 0.49,
        "acc_norm_stderr": 0.05024183937956912
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.52,
        "acc_stderr": 0.050211673156867795,
        "acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.032400380867927465, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.032400380867927465 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.041227371113703316, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.025253032554997692, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.025253032554997692 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7580645161290323, "acc_stderr": 0.024362599693031093, "acc_norm": 0.7580645161290323, "acc_norm_stderr": 0.024362599693031093 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.02338193534812143, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.02338193534812143 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6512820512820513, "acc_stderr": 0.02416278028401772, "acc_norm": 0.6512820512820513, "acc_norm_stderr": 0.02416278028401772 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6596638655462185, "acc_stderr": 0.03077805742293167, "acc_norm": 0.6596638655462185, "acc_norm_stderr": 0.03077805742293167 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8238532110091743, "acc_stderr": 0.016332882393431378, "acc_norm": 0.8238532110091743, "acc_norm_stderr": 0.016332882393431378 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 
0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588663, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588663 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601443, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601443 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.04058042015646034, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.04058042015646034 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841403, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841403 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 
0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8058748403575989, "acc_stderr": 0.01414397027665757, "acc_norm": 0.8058748403575989, "acc_norm_stderr": 0.01414397027665757 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7052023121387283, "acc_stderr": 0.024547617794803828, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.024547617794803828 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30614525139664805, "acc_stderr": 0.015414494487903219, "acc_norm": 0.30614525139664805, "acc_norm_stderr": 0.015414494487903219 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885135, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885135 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7253086419753086, "acc_stderr": 0.024836057868294677, "acc_norm": 0.7253086419753086, "acc_norm_stderr": 0.024836057868294677 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236848, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236848 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869647, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.029163128570670733, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.029163128570670733 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6601307189542484, "acc_stderr": 0.01916241858862356, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.01916241858862356 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 
0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.028920583220675606, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.028920583220675606 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.3659730722154223, "mc1_stderr": 0.01686294168408838, "mc2": 0.5212953226899916, "mc2_stderr": 0.015376349056492798 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
64,889
[ [ -0.052032470703125, -0.05560302734375, 0.0181884765625, 0.01169586181640625, -0.00946044921875, -0.0034332275390625, 0.001583099365234375, -0.015899658203125, 0.04241943359375, -0.00557708740234375, -0.035736083984375, -0.0465087890625, -0.0283355712890625, ...
amphora/lmsys-finance
2023-10-10T12:25:26.000Z
[ "task_categories:conversational", "size_categories:n<1K", "language:en", "finance", "region:us" ]
amphora
null
null
0
0
2023-10-10T12:16:02
--- dataset_info: features: - name: conversation_id dtype: string - name: model dtype: string - name: conversation dtype: string - name: turn dtype: int64 - name: language dtype: string - name: openai_moderation dtype: string - name: redacted dtype: bool - name: count dtype: int64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 10328855 num_examples: 735 download_size: 3912614 dataset_size: 10328855 task_categories: - conversational language: - en tags: - finance size_categories: - n<1K --- # Dataset Card for "lmsys-finance" This dataset is a curated version of the [lmsys-chat-1m](https://huggingface.co/datasets/lmsys/lmsys-chat-1m) dataset, focusing solely on finance-related conversations. The refinement process encompassed: 1. Removing non-English conversations. 2. Selecting conversations from models: "vicuna-33b", "wizardlm-13b", "gpt-4", "gpt-3.5-turbo", "claude-2", "palm-2", and "claude-instant-1". 3. Excluding conversations with responses under 30 characters. 4. Filtering against a list of 100 financial keywords, keeping conversations that contain at least 10 of them.
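The four curation steps above amount to a single row-level predicate over the source `lmsys-chat-1m` rows. The sketch below is a hypothetical reconstruction, not the authors' script: the actual 100-keyword list is not published in the card, so `KEYWORDS` and the `MIN_KEYWORDS` threshold are stand-ins, and the assumed row layout (a `conversation` list of role/content turns plus `model` and `language` fields) is that of the source dataset.

```python
# Hypothetical sketch of the lmsys-finance curation filter.
# KEYWORDS and MIN_KEYWORDS are stand-ins: the card describes 100 financial
# keywords with a threshold of 10, but the real list is not published.
KEPT_MODELS = {"vicuna-33b", "wizardlm-13b", "gpt-4", "gpt-3.5-turbo",
               "claude-2", "palm-2", "claude-instant-1"}
KEYWORDS = ["stock", "bond", "dividend", "portfolio", "inflation"]  # stand-in
MIN_KEYWORDS = 2          # the card uses 10 (with its 100-keyword list)
MIN_RESPONSE_CHARS = 30

def keep(example):
    """Return True if an lmsys-chat-1m row survives all four filters."""
    if example["language"] != "English":            # 1. English only
        return False
    if example["model"] not in KEPT_MODELS:         # 2. selected models only
        return False
    turns = example["conversation"]
    if any(t["role"] == "assistant" and len(t["content"]) < MIN_RESPONSE_CHARS
           for t in turns):                         # 3. no short responses
        return False
    text = " ".join(t["content"] for t in turns).lower()
    hits = sum(1 for kw in KEYWORDS if kw in text)  # 4. keyword threshold
    return hits >= MIN_KEYWORDS

row = {"language": "English", "model": "gpt-4",
       "conversation": [
           {"role": "user", "content": "How do dividends affect stock prices?"},
           {"role": "assistant",
            "content": "Dividends reduce a stock's price on the ex-dividend date..."}]}
print(keep(row))  # → True
```

With the real keyword list and `MIN_KEYWORDS = 10`, the same predicate could be handed to `Dataset.filter` after loading the source dataset with `datasets.load_dataset`.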
1,161
[ [ -0.0301666259765625, -0.06500244140625, 0.00800323486328125, 0.0013332366943359375, -0.0225677490234375, 0.0289764404296875, -0.0164642333984375, -0.0279998779296875, 0.0491943359375, 0.06640625, -0.0955810546875, -0.04913330078125, -0.003322601318359375, 0....
Omar1010/maraton
2023-10-10T12:28:25.000Z
[ "region:us" ]
Omar1010
null
null
0
0
2023-10-10T12:28:25
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
Star-gazer/WIX1002
2023-10-10T13:41:21.000Z
[ "license:cc-by-nc-4.0", "region:us" ]
Star-gazer
null
null
0
0
2023-10-10T12:38:03
--- license: cc-by-nc-4.0 --- These are the datasets for the WIX1002 2023 assignment. Credit to: data.gov.my
114
[ [ -0.006443023681640625, 0.003753662109375, 0.03582763671875, 0.03668212890625, 0.03558349609375, -0.0159912109375, 0.050323486328125, -0.004756927490234375, 0.01444244384765625, 0.09735107421875, -0.07391357421875, -0.00033092498779296875, -0.01483917236328125, ...
sleepyboyeyes/Caroline
2023-10-10T20:01:26.000Z
[ "region:us" ]
sleepyboyeyes
null
null
0
0
2023-10-10T12:44:21
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_Yukang__LongAlpaca-7B
2023-10-27T07:49:54.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T12:48:41
--- pretty_name: Evaluation run of Yukang/LongAlpaca-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Yukang/LongAlpaca-7B](https://huggingface.co/Yukang/LongAlpaca-7B) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__LongAlpaca-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T07:49:41.202175](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-7B/blob/main/results_2023-10-27T07-49-41.202175.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.019190436241610737,\n\ \ \"em_stderr\": 0.001404993842617801,\n \"f1\": 0.08690436241610738,\n\ \ \"f1_stderr\": 0.002001281340660649,\n \"acc\": 0.3007103393843725,\n\ \ \"acc_stderr\": 0.00688017858843692\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.019190436241610737,\n \"em_stderr\": 0.001404993842617801,\n\ \ \"f1\": 0.08690436241610738,\n \"f1_stderr\": 0.002001281340660649\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.601420678768745,\n\ \ \"acc_stderr\": 0.01376035717687384\n }\n}\n```" repo_url: https://huggingface.co/Yukang/LongAlpaca-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|arc:challenge|25_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T12-48-17.445800.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T07_49_41.202175 path: - '**/details_harness|drop|3_2023-10-27T07-49-41.202175.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T07-49-41.202175.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T07_49_41.202175 path: - '**/details_harness|gsm8k|5_2023-10-27T07-49-41.202175.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T07-49-41.202175.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hellaswag|10_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T12_48_17.445800 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T12-48-17.445800.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T12-48-17.445800.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T07_49_41.202175 path: - '**/details_harness|winogrande|5_2023-10-27T07-49-41.202175.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T07-49-41.202175.parquet' - config_name: results data_files: - split: 2023_10_10T12_48_17.445800 path: - results_2023-10-10T12-48-17.445800.parquet - split: 2023_10_27T07_49_41.202175 path: - results_2023-10-27T07-49-41.202175.parquet - split: latest path: - results_2023-10-27T07-49-41.202175.parquet --- # Dataset Card for Evaluation run of Yukang/LongAlpaca-7B ## Dataset Description - **Homepage:** 
- **Repository:** https://huggingface.co/Yukang/LongAlpaca-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Yukang/LongAlpaca-7B](https://huggingface.co/Yukang/LongAlpaca-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Yukang__LongAlpaca-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T07:49:41.202175](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-7B/blob/main/results_2023-10-27T07-49-41.202175.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.019190436241610737, "em_stderr": 0.001404993842617801, "f1": 0.08690436241610738, "f1_stderr": 0.002001281340660649, "acc": 0.3007103393843725, "acc_stderr": 0.00688017858843692 }, "harness|drop|3": { "em": 0.019190436241610737, "em_stderr": 0.001404993842617801, "f1": 0.08690436241610738, "f1_stderr": 0.002001281340660649 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.601420678768745, "acc_stderr": 0.01376035717687384 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,479
[ [ -0.033599853515625, -0.049957275390625, 0.0181121826171875, 0.025115966796875, -0.0183563232421875, 0.0022430419921875, -0.03131103515625, -0.0206451416015625, 0.035003662109375, 0.0462646484375, -0.04949951171875, -0.0703125, -0.047576904296875, 0.017395019...
MoaazId/cityscape_Fine
2023-10-10T13:07:28.000Z
[ "region:us" ]
MoaazId
null
null
0
0
2023-10-10T12:51:08
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_undi95__llama2-to-mistral-diff
2023-10-25T09:38:01.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T12:56:12
--- pretty_name: Evaluation run of undi95/llama2-to-mistral-diff dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [undi95/llama2-to-mistral-diff](https://huggingface.co/undi95/llama2-to-mistral-diff)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_undi95__llama2-to-mistral-diff\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T09:37:53.083823](https://huggingface.co/datasets/open-llm-leaderboard/details_undi95__llama2-to-mistral-diff/blob/main/results_2023-10-25T09-37-53.083823.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\ \ \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05605494966442959,\n\ \ \"f1_stderr\": 0.0013169501309663063,\n \"acc\": 0.4076941764856182,\n\ \ \"acc_stderr\": 0.009790166925519655\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n\ \ \"f1\": 0.05605494966442959,\n \"f1_stderr\": 0.0013169501309663063\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \ \ \"acc_stderr\": 0.007257633145486643\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\ \ }\n}\n```" repo_url: https://huggingface.co/undi95/llama2-to-mistral-diff leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|arc:challenge|25_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T12-55-48.397880.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T07_59_15.869817 path: - '**/details_harness|drop|3_2023-10-24T07-59-15.869817.parquet' - split: 2023_10_25T09_37_53.083823 path: - '**/details_harness|drop|3_2023-10-25T09-37-53.083823.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T09-37-53.083823.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T07_59_15.869817 path: - '**/details_harness|gsm8k|5_2023-10-24T07-59-15.869817.parquet' - split: 2023_10_25T09_37_53.083823 path: - '**/details_harness|gsm8k|5_2023-10-25T09-37-53.083823.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T09-37-53.083823.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T12_55_48.397880 path: - 
'**/details_harness|hellaswag|10_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T12_55_48.397880 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T12-55-48.397880.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T12-55-48.397880.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T07_59_15.869817 path: - '**/details_harness|winogrande|5_2023-10-24T07-59-15.869817.parquet' - split: 2023_10_25T09_37_53.083823 path: - '**/details_harness|winogrande|5_2023-10-25T09-37-53.083823.parquet' - split: latest path: - 
'**/details_harness|winogrande|5_2023-10-25T09-37-53.083823.parquet' - config_name: results data_files: - split: 2023_10_10T12_55_48.397880 path: - results_2023-10-10T12-55-48.397880.parquet - split: 2023_10_24T07_59_15.869817 path: - results_2023-10-24T07-59-15.869817.parquet - split: 2023_10_25T09_37_53.083823 path: - results_2023-10-25T09-37-53.083823.parquet - split: latest path: - results_2023-10-25T09-37-53.083823.parquet --- # Dataset Card for Evaluation run of undi95/llama2-to-mistral-diff ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/undi95/llama2-to-mistral-diff - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [undi95/llama2-to-mistral-diff](https://huggingface.co/undi95/llama2-to-mistral-diff) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_undi95__llama2-to-mistral-diff", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T09:37:53.083823](https://huggingface.co/datasets/open-llm-leaderboard/details_undi95__llama2-to-mistral-diff/blob/main/results_2023-10-25T09-37-53.083823.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.001153523489932886, "em_stderr": 0.00034761798968571027, "f1": 0.05605494966442959, "f1_stderr": 0.0013169501309663063, "acc": 0.4076941764856182, "acc_stderr": 0.009790166925519655 }, "harness|drop|3": { "em": 0.001153523489932886, "em_stderr": 0.00034761798968571027, "f1": 0.05605494966442959, "f1_stderr": 0.0013169501309663063 }, "harness|gsm8k|5": { "acc": 0.07505686125852919, "acc_stderr": 0.007257633145486643 }, "harness|winogrande|5": { "acc": 0.7403314917127072, "acc_stderr": 0.012322700705552667 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
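The run splits listed in the configs above are named after the run timestamp (for example `2023_10_25T09_37_53.083823`). As a minimal sketch, assuming only the underscore-separated timestamp format shown in this card, such a split name can be mapped back to a `datetime`; the helper `split_to_datetime` below is illustrative and not part of the `datasets` API:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Parse a run split name such as '2023_10_25T09_37_53.083823',
    where date and time components are joined by underscores."""
    date_part, time_part = split_name.split("T")
    # Restore ISO 8601 separators: dashes for the date, colons for the time.
    return datetime.fromisoformat(
        f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    )

# The most recent run listed in this card:
print(split_to_datetime("2023_10_25T09_37_53.083823"))  # 2023-10-25 09:37:53.083823
```

This makes it straightforward to sort run splits chronologically when a configuration contains several evaluation runs.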
39,131
[ [ -0.026123046875, -0.040771484375, 0.0180511474609375, 0.0232391357421875, -0.014373779296875, 0.00806427001953125, -0.0213775634765625, -0.01407623291015625, 0.02294921875, 0.0350341796875, -0.0479736328125, -0.0638427734375, -0.054046630859375, 0.0178527832...
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype
2023-10-27T22:16:33.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:00:24
--- pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-19b-prototype dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [The-Face-Of-Goonery/Huginn-19b-prototype](https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T22:16:21.455804](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype/blob/main/results_2023-10-27T22-16-21.455804.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.055264261744966445,\n\ \ \"em_stderr\": 0.0023400062101028673,\n \"f1\": 0.1135434144295301,\n\ \ \"f1_stderr\": 0.0025693901510907753,\n \"acc\": 0.4039910888938488,\n\ \ \"acc_stderr\": 0.008790747649701043\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.055264261744966445,\n \"em_stderr\": 0.0023400062101028673,\n\ \ \"f1\": 0.1135434144295301,\n \"f1_stderr\": 0.0025693901510907753\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04397270659590599,\n \ \ \"acc_stderr\": 0.005647666449126459\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275626\n\ \ }\n}\n```" repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-00-00.797867.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T22_16_21.455804 path: - '**/details_harness|drop|3_2023-10-27T22-16-21.455804.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T22-16-21.455804.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T22_16_21.455804 path: - '**/details_harness|gsm8k|5_2023-10-27T22-16-21.455804.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T22-16-21.455804.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hellaswag|10_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_00_00.797867 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-00-00.797867.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-00-00.797867.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T22_16_21.455804 path: - '**/details_harness|winogrande|5_2023-10-27T22-16-21.455804.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T22-16-21.455804.parquet' - config_name: results data_files: - split: 2023_10_10T13_00_00.797867 path: - results_2023-10-10T13-00-00.797867.parquet - split: 2023_10_27T22_16_21.455804 path: - results_2023-10-27T22-16-21.455804.parquet - split: latest path: - results_2023-10-27T22-16-21.455804.parquet --- # Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-19b-prototype ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-19b-prototype](https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T22:16:21.455804](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype/blob/main/results_2023-10-27T22-16-21.455804.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.055264261744966445, "em_stderr": 0.0023400062101028673, "f1": 0.1135434144295301, "f1_stderr": 0.0025693901510907753, "acc": 0.4039910888938488, "acc_stderr": 0.008790747649701043 }, "harness|drop|3": { "em": 0.055264261744966445, "em_stderr": 0.0023400062101028673, "f1": 0.1135434144295301, "f1_stderr": 0.0025693901510907753 }, "harness|gsm8k|5": { "acc": 0.04397270659590599, "acc_stderr": 0.005647666449126459 }, "harness|winogrande|5": { "acc": 0.7640094711917916, "acc_stderr": 0.011933828850275626 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,800
[ [ -0.0283966064453125, -0.052032470703125, 0.0160369873046875, 0.01316070556640625, -0.00751495361328125, 0.005405426025390625, -0.0244903564453125, -0.0186614990234375, 0.035980224609375, 0.040191650390625, -0.0546875, -0.059051513671875, -0.045440673828125, ...
jjzha/imdb-dutch-instruct
2023-10-10T13:03:55.000Z
[ "size_categories:10K<n<100K", "language:nl", "license:apache-2.0", "region:us" ]
jjzha
null
null
0
0
2023-10-10T13:02:52
--- language: - nl license: - apache-2.0 size_categories: - 10K<n<100K dataset_info: features: - name: inputs dtype: string - name: targets dtype: string splits: - name: train num_examples: 24992 - name: test num_examples: 24992 --- # Dataset Card for "imdb-dutch-instruct" ## Dataset Description The original IMDB dataset was translated to Dutch with [yhavinga/ul2-large-en-nl](https://huggingface.co/yhavinga/ul2-large-en-nl). Then, the dataset was converted to an instruct-style dataset with the following templates: The instruction templates: "Is deze recensie positief of negatief?", "Wat is het sentiment van de recensie?", "Wat voor toon heeft de volgende recensie?", "Met wat voor sentiment zou je deze recensie beoordelen?" The target templates: "De recensie is", "Gegeven de recensie, mijn antwoord is", "Deze recensie is", "De beoordeling hier is", "Het antwoord is" ### Dataset Summary Large Movie Review Dataset translated to Dutch and converted to instruct style. This is a dataset for sentiment classification containing substantially more data than previous benchmark datasets. ### Languages and Example This dataset contains Dutch data. An example of 'train' looks as follows. ``` { "inputs": "Is deze recensie positief of negatief?\n\nIk heb alle vier de films in deze serie gezien. Elke film wijkt steeds verder af van de boeken. Deze is de ergste tot nu toe. Mijn probleem is dat hij op geen enkele manier het boek volgt waar hij naar genoemd is! De regisseurs en producenten hadden hem een andere naam moeten geven dan 'Love's Abiding Joy'. Het enige aan deze film dat ook maar in de verte op het boek lijkt, zijn de namen van sommige personages (Willie, Missie, Henry, Clark, Scottie en Cookie). De namen/ouders/verzorgers van de kinderen kloppen niet. De hele verhaallijn staat nergens in het boek. '<br />Ik vind het een grote belediging voor Janette Oke, haar boeken en haar fans om een film onder haar titel te produceren die in geen enkel opzicht correct is. 
De muziek is te hard. De acteurs zijn niet overtuigend – ze missen emoties.<br />Als je een goede familiefilm wilt, is dit misschien goed. Het is schoon. Maar kijk er niet naar, als je hoopt op een verkorte versie van het boek. Ik hoop dat dit de laatste film uit deze serie zal zijn, maar ik betwijfel het. Als er meer films worden gemaakt, zou ik willen dat Michael Landon jr. en anderen dichter bij de oorspronkelijke plot en verhaallijn zouden blijven. De boeken zijn uitstekend en als je ze goed leest, zijn het uitstekende films!", "targets": "Het antwoord is negatief."} ``` ### Data Fields The data fields are the same among all splits. #### plain_text - `inputs`: a `string` feature, starting with a question whether the review is positive or negative. - `targets`: a `string` feature, with a template prefix and the final label. ### Data Splits | name |train|test | |----------|----:|----:| |plain_text|24992|24992| ### Official Citation Information The original data is from here: https://huggingface.co/datasets/yhavinga/imdb_dutch ``` @InProceedings{maas-EtAl:2011:ACL-HLT2011, author = {Maas, Andrew L. and Daly, Raymond E. and Pham, Peter T. and Huang, Dan and Ng, Andrew Y. and Potts, Christopher}, title = {Learning Word Vectors for Sentiment Analysis}, booktitle = {Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies}, month = {June}, year = {2011}, address = {Portland, Oregon, USA}, publisher = {Association for Computational Linguistics}, pages = {142--150}, url = {http://www.aclweb.org/anthology/P11-1015} } ``` Created by [Mike Zhang](https://jjzha.github.io/)
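The template conversion described in the imdb-dutch-instruct card can be sketched in a few lines of Python. Note this is only an illustration built from the templates listed in the card: the random pairing of an instruction template with a target prefix, and the `to_instruct` helper itself, are assumptions, not the card author's actual conversion script.

```python
import random

# Instruction and target templates quoted from the dataset card.
INSTRUCTIONS = [
    "Is deze recensie positief of negatief?",
    "Wat is het sentiment van de recensie?",
    "Wat voor toon heeft de volgende recensie?",
    "Met wat voor sentiment zou je deze recensie beoordelen?",
]
TARGET_PREFIXES = [
    "De recensie is",
    "Gegeven de recensie, mijn antwoord is",
    "Deze recensie is",
    "De beoordeling hier is",
    "Het antwoord is",
]

def to_instruct(review: str, label: str, rng: random.Random) -> dict:
    """Turn a (review, label) pair into an instruct-style example:
    a sampled instruction, a blank line, the review text, and a
    sampled answer prefix followed by the sentiment label."""
    instruction = rng.choice(INSTRUCTIONS)
    prefix = rng.choice(TARGET_PREFIXES)
    return {
        "inputs": f"{instruction}\n\n{review}",
        "targets": f"{prefix} {label}.",
    }

example = to_instruct("Ik vond de film geweldig.", "positief", random.Random(0))
```

Fixing the `random.Random` seed makes such a conversion reproducible across the identically sized train and test splits.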
3,813
[ [ -0.050140380859375, -0.04730224609375, 0.00952911376953125, 0.015869140625, -0.0372314453125, -0.014190673828125, -0.01812744140625, -0.01568603515625, 0.06243896484375, 0.0310211181640625, -0.05841064453125, -0.04852294921875, -0.06500244140625, 0.020553588...
open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft
2023-10-10T13:04:36.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:03:35
--- pretty_name: Evaluation run of Yukang/Llama-2-7b-longlora-32k-ft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Yukang/Llama-2-7b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T13:03:11.726005](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft/blob/main/results_2023-10-10T13-03-11.726005.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23095342099711472,\n\ \ \"acc_stderr\": 0.03068798097484314,\n \"acc_norm\": 0.23205178409700405,\n\ \ \"acc_norm_stderr\": 0.03070714755069479,\n \"mc1\": 0.2386780905752754,\n\ \ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4957068563437588,\n\ \ \"mc2_stderr\": 0.016914945574930968\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n\ \ \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601341\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2570205138418642,\n\ \ \"acc_stderr\": 0.004360977256058745,\n \"acc_norm\": 0.2561242780322645,\n\ \ \"acc_norm_stderr\": 0.004355992090030995\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\ \ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\ \ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\ \ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \ \ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\ \ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\ \ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\ \ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\ \ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\ \ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\ acc_norm\": 
0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\ acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673621,\n \"\ acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673621\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\ acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\ \ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\ \ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \ \ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n\ \ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\ acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\ acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\ acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\ \ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\ \ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\ \ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\ \ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\ \ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\ \ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\ \ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\ \ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\ \ \"acc_norm_stderr\": 0.014422292204808835\n 
},\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\ \ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\ \ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\ \ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \ \ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\ \ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\ \ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\ \ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n\ \ \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n\ \ \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 
0.03036049015401465,\n\ \ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\ \ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\ \ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\ \ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\ \ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456418,\n\ \ \"mc2\": 0.4957068563437588,\n \"mc2_stderr\": 0.016914945574930968\n\ \ }\n}\n```" repo_url: https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hellaswag|10_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet' - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet' - 
'**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet' - 
'**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet' - 
config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet' - config_name: 
harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - 
split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet' - 
config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_03_11.726005 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-03-11.726005.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-03-11.726005.parquet' - config_name: results data_files: - split: 2023_10_10T13_03_11.726005 path: - results_2023-10-10T13-03-11.726005.parquet - split: latest path: - results_2023-10-10T13-03-11.726005.parquet --- # Dataset Card for Evaluation run of Yukang/Llama-2-7b-longlora-32k-ft ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Yukang/Llama-2-7b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). 
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T13:03:11.726005](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft/blob/main/results_2023-10-10T13-03-11.726005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23095342099711472, "acc_stderr": 0.03068798097484314, "acc_norm": 0.23205178409700405, "acc_norm_stderr": 0.03070714755069479, "mc1": 0.2386780905752754, "mc1_stderr": 0.014922629695456418, "mc2": 0.4957068563437588, "mc2_stderr": 0.016914945574930968 }, "harness|arc:challenge|25": { "acc": 0.21331058020477817, "acc_stderr": 0.011970971742326334, "acc_norm": 0.2790102389078498, "acc_norm_stderr": 0.013106784883601341 }, "harness|hellaswag|10": { "acc": 0.2570205138418642, "acc_stderr": 0.004360977256058745, "acc_norm": 0.2561242780322645, "acc_norm_stderr": 0.004355992090030995 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, 
"acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.19704433497536947, "acc_stderr": 0.02798672466673621, "acc_norm": 0.19704433497536947, "acc_norm_stderr": 0.02798672466673621 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21428571428571427, "acc_stderr": 0.02665353159671549, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.02665353159671549 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 
0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, 
"acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 
0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.17959183673469387, "acc_stderr": 0.024573293589585637, "acc_norm": 0.17959183673469387, "acc_norm_stderr": 0.024573293589585637 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.2386780905752754, "mc1_stderr": 0.014922629695456418, "mc2": 0.4957068563437588, "mc2_stderr": 0.016914945574930968 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
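Once loaded, the "Latest results" JSON above is an ordinary nested dictionary keyed by task name. As a quick, self-contained illustration (using only a few accuracy values copied from the results block above, not the full set of tasks; the variable names `accs` and `best_task` are illustrative), one way to pull out per-task accuracies and find the strongest task: ```python # A few entries copied verbatim from the "Latest results" JSON above; # the full results dict has one entry per harness task. results = { "harness|arc:challenge|25": {"acc": 0.21331058020477817, "acc_norm": 0.2790102389078498}, "harness|hellaswag|10": {"acc": 0.2570205138418642, "acc_norm": 0.2561242780322645}, "harness|hendrycksTest-world_religions|5": {"acc": 0.3216374269005848, "acc_norm": 0.3216374269005848}, } # Extract a {task: accuracy} mapping and report the best-scoring task. accs = {task: metrics["acc"] for task, metrics in results.items()} best_task = max(accs, key=accs.get) print(best_task, accs[best_task]) ``` The same pattern applies to `acc_norm`, `mc1`, or `mc2` fields where a task reports them.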
64,990
[ [ -0.049713134765625, -0.059600830078125, 0.021392822265625, 0.017425537109375, -0.0139312744140625, -0.004222869873046875, 0.0005006790161132812, -0.0192108154296875, 0.0418701171875, -0.00220489501953125, -0.033477783203125, -0.047393798828125, -0.0302734375, ...
open-llm-leaderboard/details_Envoid__Libra-19B
2023-10-28T19:47:00.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:07:09
--- pretty_name: Evaluation run of Envoid/Libra-19B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Envoid/Libra-19B](https://huggingface.co/Envoid/Libra-19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Envoid__Libra-19B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T19:46:48.305345](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Libra-19B/blob/main/results_2023-10-28T19-46-48.305345.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4041526845637584,\n\ \ \"em_stderr\": 0.005025506959675468,\n \"f1\": 0.4663181627516796,\n\ \ \"f1_stderr\": 0.0048334958525102995,\n \"acc\": 0.38198917766143897,\n\ \ \"acc_stderr\": 0.006352871239464968\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.4041526845637584,\n \"em_stderr\": 0.005025506959675468,\n\ \ \"f1\": 0.4663181627516796,\n \"f1_stderr\": 0.0048334958525102995\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \ \ \"acc_stderr\": 0.0007581501137225396\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207397\n\ \ }\n}\n```" repo_url: https://huggingface.co/Envoid/Libra-19B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-06-44.906506.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T19_46_48.305345 path: - '**/details_harness|drop|3_2023-10-28T19-46-48.305345.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T19-46-48.305345.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T19_46_48.305345 path: - '**/details_harness|gsm8k|5_2023-10-28T19-46-48.305345.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T19-46-48.305345.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hellaswag|10_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_06_44.906506 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-06-44.906506.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-06-44.906506.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T19_46_48.305345 path: - '**/details_harness|winogrande|5_2023-10-28T19-46-48.305345.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T19-46-48.305345.parquet' - config_name: results data_files: - split: 2023_10_10T13_06_44.906506 path: - results_2023-10-10T13-06-44.906506.parquet - split: 2023_10_28T19_46_48.305345 path: - results_2023-10-28T19-46-48.305345.parquet - split: latest path: - results_2023-10-28T19-46-48.305345.parquet --- # Dataset Card for Evaluation run of Envoid/Libra-19B ## Dataset Description - **Homepage:** - 
**Repository:** https://huggingface.co/Envoid/Libra-19B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Envoid/Libra-19B](https://huggingface.co/Envoid/Libra-19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Envoid__Libra-19B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T19:46:48.305345](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Libra-19B/blob/main/results_2023-10-28T19-46-48.305345.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each one in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.4041526845637584, "em_stderr": 0.005025506959675468, "f1": 0.4663181627516796, "f1_stderr": 0.0048334958525102995, "acc": 0.38198917766143897, "acc_stderr": 0.006352871239464968 }, "harness|drop|3": { "em": 0.4041526845637584, "em_stderr": 0.005025506959675468, "f1": 0.4663181627516796, "f1_stderr": 0.0048334958525102995 }, "harness|gsm8k|5": { "acc": 0.000758150113722517, "acc_stderr": 0.0007581501137225396 }, "harness|winogrande|5": { "acc": 0.7632202052091555, "acc_stderr": 0.011947592365207397 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
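In these result files, the "all" block appears to be an unweighted mean of each metric over the tasks that report it: "acc" averages gsm8k and winogrande, while "em" and "f1" come from drop alone, which the numbers above bear out. A minimal sketch checking this in plain Python (the `aggregate` helper is illustrative, not part of the evaluation harness):

```python
# Per-task metrics copied from the latest-results JSON above.
results = {
    "harness|drop|3": {
        "em": 0.4041526845637584, "em_stderr": 0.005025506959675468,
        "f1": 0.4663181627516796, "f1_stderr": 0.0048334958525102995,
    },
    "harness|gsm8k|5": {
        "acc": 0.000758150113722517, "acc_stderr": 0.0007581501137225396,
    },
    "harness|winogrande|5": {
        "acc": 0.7632202052091555, "acc_stderr": 0.011947592365207397,
    },
}

def aggregate(metric: str) -> float:
    """Unweighted mean of `metric` over every task that reports it."""
    values = [m[metric] for m in results.values() if metric in m]
    return sum(values) / len(values)

# Reproduces the "all" block: acc is the mean of gsm8k and winogrande,
# while em comes from the single task (drop) that reports it.
print(round(aggregate("acc"), 6))
print(round(aggregate("em"), 6))
```

The stderr fields are averaged the same way, so `aggregate("acc_stderr")` matches the "all" value as well.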
38,501
[ [ -0.0350341796875, -0.05206298828125, 0.007411956787109375, 0.0169830322265625, -0.009735107421875, 0.005886077880859375, -0.0233001708984375, -0.013427734375, 0.032867431640625, 0.045501708984375, -0.04937744140625, -0.06524658203125, -0.041748046875, 0.0099...
open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft
2023-10-27T07:10:17.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:09:13
--- pretty_name: Evaluation run of Yukang/Llama-2-7b-longlora-16k-ft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Yukang/Llama-2-7b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T07:10:03.989833](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft/blob/main/results_2023-10-27T07-10-03.989833.json)\ \ (note that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\ em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\ acc\": 0.2430939226519337,\n \"acc_stderr\": 0.007023561458220208\n },\n\ \ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\ \ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\ : {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.4861878453038674,\n \"acc_stderr\": 0.014047122916440415\n\ \ }\n}\n```" repo_url: https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-08-49.738155.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T07_10_03.989833 path: - '**/details_harness|drop|3_2023-10-27T07-10-03.989833.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T07-10-03.989833.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T07_10_03.989833 path: - '**/details_harness|gsm8k|5_2023-10-27T07-10-03.989833.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T07-10-03.989833.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hellaswag|10_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_08_49.738155 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-08-49.738155.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-08-49.738155.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T07_10_03.989833 path: - '**/details_harness|winogrande|5_2023-10-27T07-10-03.989833.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T07-10-03.989833.parquet' - config_name: results data_files: - split: 2023_10_10T13_08_49.738155 path: - results_2023-10-10T13-08-49.738155.parquet - split: 2023_10_27T07_10_03.989833 path: - results_2023-10-27T07-10-03.989833.parquet - split: latest path: - results_2023-10-27T07-10-03.989833.parquet --- # Dataset Card for Evaluation run of Yukang/Llama-2-7b-longlora-16k-ft ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Yukang/Llama-2-7b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T07:10:03.989833](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft/blob/main/results_2023-10-27T07-10-03.989833.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0, "acc": 0.2430939226519337, "acc_stderr": 0.007023561458220208 }, "harness|drop|3": { "em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.4861878453038674, "acc_stderr": 0.014047122916440415 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
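Each evaluation run above is stored as a split named after the run's timestamp (e.g. `2023_10_10T13_08_49.738155`), and the `latest` split in each config points at the most recent run's files. As a minimal sketch (using the two run timestamps listed in this card), the newest run can be picked by plain string comparison, since the names are zero-padded and year-first:

```python
# Split names follow the pattern YYYY_MM_DDTHH_MM_SS.micro, so plain
# lexicographic comparison orders the runs chronologically.
run_splits = [
    "2023_10_10T13_08_49.738155",  # first eval run
    "2023_10_27T07_10_03.989833",  # second eval run
]

latest_run = max(run_splits)  # lexicographic max == most recent timestamp
print(latest_run)  # -> 2023_10_27T07_10_03.989833
```

Loading such a split by name, rather than `latest`, pins your analysis to one specific run even after new evals are added.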
38,365
[ [ -0.0277252197265625, -0.04937744140625, 0.0180206298828125, 0.02471923828125, -0.021697998046875, 0.006092071533203125, -0.033111572265625, -0.0249176025390625, 0.036376953125, 0.043609619140625, -0.0489501953125, -0.0699462890625, -0.047698974609375, 0.0223...
open-llm-leaderboard/details_Envoid__Yousei-22B
2023-10-26T01:22:32.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:10:05
--- pretty_name: Evaluation run of Envoid/Yousei-22B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Envoid/Yousei-22B](https://huggingface.co/Envoid/Yousei-22B) on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Envoid__Yousei-22B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T01:22:19.156649](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Yousei-22B/blob/main/results_2023-10-26T01-22-19.156649.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15604026845637584,\n\ \ \"em_stderr\": 0.003716369253387427,\n \"f1\": 0.23708158557046954,\n\ \ \"f1_stderr\": 0.003820843210859161,\n \"acc\": 0.3598119404753428,\n\ \ \"acc_stderr\": 0.007269770584572424\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.15604026845637584,\n \"em_stderr\": 0.003716369253387427,\n\ \ \"f1\": 0.23708158557046954,\n \"f1_stderr\": 0.003820843210859161\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \ \ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7150749802683505,\n \"acc_stderr\": 0.012685986125141229\n\ \ }\n}\n```" repo_url: https://huggingface.co/Envoid/Yousei-22B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-09-41.852615.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T01_22_19.156649 path: - '**/details_harness|drop|3_2023-10-26T01-22-19.156649.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T01-22-19.156649.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T01_22_19.156649 path: - '**/details_harness|gsm8k|5_2023-10-26T01-22-19.156649.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T01-22-19.156649.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hellaswag|10_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_09_41.852615 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-09-41.852615.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-09-41.852615.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T01_22_19.156649 path: - '**/details_harness|winogrande|5_2023-10-26T01-22-19.156649.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T01-22-19.156649.parquet' - config_name: results data_files: - split: 2023_10_10T13_09_41.852615 path: - results_2023-10-10T13-09-41.852615.parquet - split: 2023_10_26T01_22_19.156649 path: - results_2023-10-26T01-22-19.156649.parquet - split: latest path: - results_2023-10-26T01-22-19.156649.parquet --- # Dataset Card for Evaluation run of Envoid/Yousei-22B ## Dataset Description - **Homepage:** - 
**Repository:** https://huggingface.co/Envoid/Yousei-22B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Envoid/Yousei-22B](https://huggingface.co/Envoid/Yousei-22B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Envoid__Yousei-22B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T01:22:19.156649](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Yousei-22B/blob/main/results_2023-10-26T01-22-19.156649.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each one in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.15604026845637584, "em_stderr": 0.003716369253387427, "f1": 0.23708158557046954, "f1_stderr": 0.003820843210859161, "acc": 0.3598119404753428, "acc_stderr": 0.007269770584572424 }, "harness|drop|3": { "em": 0.15604026845637584, "em_stderr": 0.003716369253387427, "f1": 0.23708158557046954, "f1_stderr": 0.003820843210859161 }, "harness|gsm8k|5": { "acc": 0.004548900682335102, "acc_stderr": 0.0018535550440036204 }, "harness|winogrande|5": { "acc": 0.7150749802683505, "acc_stderr": 0.012685986125141229 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
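The aggregated results JSON above is a nested task-to-metrics mapping and can be consumed directly once downloaded. A minimal sketch (plain Python; the metric values are copied from the latest-results block of this card, and `get_metric` is an illustrative helper, not part of the `datasets` API):

```python
# The aggregated "latest" results from this card, as a plain Python dict
# (values copied verbatim from the JSON block above).
latest_results = {
    "all": {"em": 0.15604026845637584, "f1": 0.23708158557046954, "acc": 0.3598119404753428},
    "harness|drop|3": {"em": 0.15604026845637584, "f1": 0.23708158557046954},
    "harness|gsm8k|5": {"acc": 0.004548900682335102},
    "harness|winogrande|5": {"acc": 0.7150749802683505},
}

def get_metric(results: dict, task: str, metric: str) -> float:
    """Return a single metric for one task, e.g. winogrande accuracy."""
    return results[task][metric]

winogrande_acc = get_metric(latest_results, "harness|winogrande|5", "acc")
print(f"winogrande acc: {winogrande_acc:.4f}")  # prints "winogrande acc: 0.7151"
```

The same lookup pattern applies to any row loaded from the "results" configuration of this dataset, since each stored record mirrors this task/metric nesting.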
38,520
[ [ -0.033935546875, -0.052001953125, 0.00965118408203125, 0.0177459716796875, -0.01104736328125, 0.00823974609375, -0.023590087890625, -0.012451171875, 0.0318603515625, 0.04449462890625, -0.0494384765625, -0.07049560546875, -0.0477294921875, 0.01241302490234375...
open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft
2023-10-28T02:49:54.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:26:38
--- pretty_name: Evaluation run of Yukang/Llama-2-13b-longlora-32k-ft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Yukang/Llama-2-13b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T02:49:42.173825](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft/blob/main/results_2023-10-28T02-49-42.173825.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You can find each of them in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.0010486577181208054,\n\ \        \"em_stderr\": 0.0003314581465219189,\n        \"f1\": 0.05492764261744986,\n\ \        \"f1_stderr\": 0.0012887827966655012,\n        \"acc\": 0.4163294284912454,\n\ \        \"acc_stderr\": 0.009719919588691044\n    },\n    \"harness|drop|3\": {\n\ \        \"em\": 0.0010486577181208054,\n        \"em_stderr\": 0.0003314581465219189,\n\ \        \"f1\": 0.05492764261744986,\n        \"f1_stderr\": 0.0012887827966655012\n\ \    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.07733131159969674,\n        \ \ \"acc_stderr\": 0.00735771352322235\n    },\n    \"harness|winogrande|5\"\ : {\n        \"acc\": 0.755327545382794,\n        \"acc_stderr\": 0.012082125654159738\n\ \    }\n}\n```" repo_url: https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-26-13.835261.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T02_49_42.173825 path: - '**/details_harness|drop|3_2023-10-28T02-49-42.173825.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T02-49-42.173825.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T02_49_42.173825 path: - '**/details_harness|gsm8k|5_2023-10-28T02-49-42.173825.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T02-49-42.173825.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hellaswag|10_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_26_13.835261 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-26-13.835261.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-26-13.835261.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T02_49_42.173825 path: - '**/details_harness|winogrande|5_2023-10-28T02-49-42.173825.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T02-49-42.173825.parquet' - config_name: results data_files: - split: 2023_10_10T13_26_13.835261 path: - results_2023-10-10T13-26-13.835261.parquet - split: 2023_10_28T02_49_42.173825 path: - results_2023-10-28T02-49-42.173825.parquet - split: latest path: - results_2023-10-28T02-49-42.173825.parquet --- # Dataset Card for Evaluation run of Yukang/Llama-2-13b-longlora-32k-ft ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T02:49:42.173825](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft/blob/main/results_2023-10-28T02-49-42.173825.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0010486577181208054, "em_stderr": 0.0003314581465219189, "f1": 0.05492764261744986, "f1_stderr": 0.0012887827966655012, "acc": 0.4163294284912454, "acc_stderr": 0.009719919588691044 }, "harness|drop|3": { "em": 0.0010486577181208054, "em_stderr": 0.0003314581465219189, "f1": 0.05492764261744986, "f1_stderr": 0.0012887827966655012 }, "harness|gsm8k|5": { "acc": 0.07733131159969674, "acc_stderr": 0.00735771352322235 }, "harness|winogrande|5": { "acc": 0.755327545382794, "acc_stderr": 0.012082125654159738 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,732
[ [ -0.0293731689453125, -0.05023193359375, 0.0188140869140625, 0.0244598388671875, -0.01953125, 0.006134033203125, -0.03033447265625, -0.02386474609375, 0.03515625, 0.040374755859375, -0.048675537109375, -0.0677490234375, -0.04718017578125, 0.020721435546875, ...
open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft
2023-10-25T13:52:05.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T13:33:15
--- pretty_name: Evaluation run of Yukang/Llama-2-13b-longlora-16k-ft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Yukang/Llama-2-13b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T13:51:53.444348](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft/blob/main/results_2023-10-25T13-51-53.444348.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\ em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\ acc\": 0.2478295185477506,\n \"acc_stderr\": 0.007025978032038448\n },\n\ \ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\ \ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\ : {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076896\n\ \ }\n}\n```" repo_url: https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|arc:challenge|25_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T13-32-51.379088.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T13_51_53.444348 path: - '**/details_harness|drop|3_2023-10-25T13-51-53.444348.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T13-51-53.444348.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T13_51_53.444348 path: - '**/details_harness|gsm8k|5_2023-10-25T13-51-53.444348.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T13-51-53.444348.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hellaswag|10_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T13_32_51.379088 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-32-51.379088.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T13-32-51.379088.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T13_51_53.444348 path: - '**/details_harness|winogrande|5_2023-10-25T13-51-53.444348.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T13-51-53.444348.parquet' - config_name: results data_files: - split: 2023_10_10T13_32_51.379088 path: - results_2023-10-10T13-32-51.379088.parquet - split: 2023_10_25T13_51_53.444348 path: - results_2023-10-25T13-51-53.444348.parquet - split: latest path: - results_2023-10-25T13-51-53.444348.parquet --- # Dataset Card for Evaluation run of Yukang/Llama-2-13b-longlora-16k-ft ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T13:51:53.444348](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft/blob/main/results_2023-10-25T13-51-53.444348.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0, "acc": 0.2478295185477506, "acc_stderr": 0.007025978032038448 }, "harness|drop|3": { "em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.4956590370955012, "acc_stderr": 0.014051956064076896 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
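The aggregated metrics above are plain JSON, so they can be inspected without re-running the harness. A minimal sketch (the JSON literal below copies a subset of the results block above; the `harness|<task>|<num_fewshot>` key scheme follows it):

```python
import json

# Minimal sketch: pulling one metric out of the aggregated results JSON above.
# The literal below copies a subset of the "latest results" block; keys use the
# harness naming scheme "harness|<task>|<num_fewshot>".
results_json = """
{
  "all": {"acc": 0.2478295185477506, "acc_stderr": 0.007025978032038448},
  "harness|winogrande|5": {"acc": 0.4956590370955012, "acc_stderr": 0.014051956064076896}
}
"""

results = json.loads(results_json)
winogrande = results["harness|winogrande|5"]
print(f"winogrande acc: {winogrande['acc']:.4f} +/- {winogrande['acc_stderr']:.4f}")
# prints: winogrande acc: 0.4957 +/- 0.0141
```

The same pattern applies to the per-task entries in the full `results_*.json` files linked above.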
38,377
[ [ -0.0283050537109375, -0.04986572265625, 0.018096923828125, 0.0253143310546875, -0.02105712890625, 0.005889892578125, -0.032958984375, -0.0244903564453125, 0.036590576171875, 0.042266845703125, -0.050628662109375, -0.06964111328125, -0.047576904296875, 0.0221...
Coroseven/MarinKitagawa
2023-10-10T14:02:01.000Z
[ "region:us" ]
Coroseven
null
null
0
0
2023-10-10T14:00:21
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf
2023-10-10T14:03:00.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T14:01:58
--- pretty_name: Evaluation run of Community-LM/llava-v1.5-13b-hf dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Community-LM/llava-v1.5-13b-hf](https://huggingface.co/Community-LM/llava-v1.5-13b-hf)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T14:01:34.065508](https://huggingface.co/datasets/open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf/blob/main/results_2023-10-10T14-01-34.065508.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5687974861474466,\n\ \ \"acc_stderr\": 0.034102420636387375,\n \"acc_norm\": 0.5727205361494934,\n\ \ \"acc_norm_stderr\": 0.034085436281331656,\n \"mc1\": 0.3011015911872705,\n\ \ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.433460825483405,\n\ \ \"mc2_stderr\": 0.01517244922847158\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.01458063756999542,\n\ \ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6011750647281418,\n\ \ \"acc_stderr\": 0.004886559008754983,\n \"acc_norm\": 0.8036247759410476,\n\ \ \"acc_norm_stderr\": 0.003964437012249994\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\ \ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\ \ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \ \ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\ \ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\ \ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\ \ \"acc_norm_stderr\": 0.04089465449325582\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\ \ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\ \ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\ \ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\ \ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.328042328042328,\n \"acc_stderr\": 0.0241804971643769,\n \"acc_norm\"\ : 0.328042328042328,\n \"acc_norm_stderr\": 
0.0241804971643769\n },\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\ \ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\ \ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n\ \ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\ \ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\ acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397433,\n\ \ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397433\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\ \ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\ \ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\ acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7577981651376147,\n \"acc_stderr\": 0.018368176306598618,\n \"\ acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.018368176306598618\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\ acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\ acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \ \ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\ \ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\ : 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\ \ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\ \ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\ \ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\ \ \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n\ \ \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\ \ \"acc_stderr\": 0.014957458504335835,\n \"acc_norm\": 0.7739463601532567,\n\ \ \"acc_norm_stderr\": 0.014957458504335835\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\ \ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\ \ \"acc_stderr\": 0.015652542496421114,\n \"acc_norm\": 
0.3240223463687151,\n\ \ \"acc_norm_stderr\": 0.015652542496421114\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424523,\n\ \ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424523\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\ \ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\ \ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868045,\n\ \ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868045\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \ \ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\ \ \"acc_stderr\": 0.012588323850313608,\n \"acc_norm\": 0.41590612777053454,\n\ \ \"acc_norm_stderr\": 0.012588323850313608\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n\ \ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227477,\n \ \ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227477\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\ \ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\ \ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\ \ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \ \ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\ \ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\ \ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.433460825483405,\n\ \ \"mc2_stderr\": 0.01517244922847158\n }\n}\n```" repo_url: https://huggingface.co/Community-LM/llava-v1.5-13b-hf leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|arc:challenge|25_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hellaswag|10_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T14_01_34.065508 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T14-01-34.065508.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T14-01-34.065508.parquet' - config_name: results data_files: - split: 2023_10_10T14_01_34.065508 path: - results_2023-10-10T14-01-34.065508.parquet - split: latest path: - results_2023-10-10T14-01-34.065508.parquet --- # Dataset Card for Evaluation run of Community-LM/llava-v1.5-13b-hf ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Community-LM/llava-v1.5-13b-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[Community-LM/llava-v1.5-13b-hf](https://huggingface.co/Community-LM/llava-v1.5-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T14:01:34.065508](https://huggingface.co/datasets/open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf/blob/main/results_2023-10-10T14-01-34.065508.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5687974861474466, "acc_stderr": 0.034102420636387375, "acc_norm": 0.5727205361494934, "acc_norm_stderr": 0.034085436281331656, "mc1": 0.3011015911872705, "mc1_stderr": 0.016058999026100612, "mc2": 0.433460825483405, "mc2_stderr": 0.01517244922847158 }, "harness|arc:challenge|25": { "acc": 0.5324232081911263, "acc_stderr": 0.01458063756999542, "acc_norm": 0.5614334470989761, "acc_norm_stderr": 0.014500682618212864 }, "harness|hellaswag|10": { "acc": 0.6011750647281418, "acc_stderr": 0.004886559008754983, "acc_norm": 0.8036247759410476, "acc_norm_stderr": 0.003964437012249994 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5855263157894737, "acc_stderr": 0.04008973785779206, "acc_norm": 0.5855263157894737, "acc_norm_stderr": 0.04008973785779206 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6041666666666666, "acc_stderr": 0.04089465449325582, "acc_norm": 0.6041666666666666, "acc_norm_stderr": 0.04089465449325582 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, 
"acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5375722543352601, "acc_stderr": 0.0380168510452446, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.043898699568087764, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.043898699568087764 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.44680851063829785, "acc_stderr": 0.0325005368436584, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.044346007015849245, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.044346007015849245 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.328042328042328, "acc_stderr": 0.0241804971643769, "acc_norm": 0.328042328042328, "acc_norm_stderr": 0.0241804971643769 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04285714285714281, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04285714285714281 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7129032258064516, "acc_stderr": 0.025736542745594528, "acc_norm": 0.7129032258064516, "acc_norm_stderr": 0.025736542745594528 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.42857142857142855, "acc_stderr": 0.03481904844438803, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.03481904844438803 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7151515151515152, "acc_stderr": 0.03524390844511781, "acc_norm": 0.7151515151515152, "acc_norm_stderr": 0.03524390844511781 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7424242424242424, "acc_stderr": 0.031156269519646836, "acc_norm": 0.7424242424242424, "acc_norm_stderr": 0.031156269519646836 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397433, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5384615384615384, "acc_stderr": 0.025275892070240644, "acc_norm": 0.5384615384615384, "acc_norm_stderr": 0.025275892070240644 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066475, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066475 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5672268907563025, "acc_stderr": 0.032183581077426124, "acc_norm": 0.5672268907563025, "acc_norm_stderr": 0.032183581077426124 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.26490066225165565, "acc_stderr": 0.03603038545360384, "acc_norm": 0.26490066225165565, "acc_norm_stderr": 0.03603038545360384 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7577981651376147, "acc_stderr": 0.018368176306598618, "acc_norm": 0.7577981651376147, "acc_norm_stderr": 0.018368176306598618 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.42592592592592593, "acc_stderr": 
0.03372343271653063, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.03372343271653063 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501947, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501947 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776678, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776678 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.04103203830514512, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.04103203830514512 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6319018404907976, "acc_stderr": 0.03789213935838396, "acc_norm": 0.6319018404907976, "acc_norm_stderr": 0.03789213935838396 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384493, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384493 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.02363687331748928, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.02363687331748928 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 
0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7739463601532567, "acc_stderr": 0.014957458504335835, "acc_norm": 0.7739463601532567, "acc_norm_stderr": 0.014957458504335835 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6271676300578035, "acc_stderr": 0.02603389061357628, "acc_norm": 0.6271676300578035, "acc_norm_stderr": 0.02603389061357628 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3240223463687151, "acc_stderr": 0.015652542496421114, "acc_norm": 0.3240223463687151, "acc_norm_stderr": 0.015652542496421114 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6078431372549019, "acc_stderr": 0.027956046165424523, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.027956046165424523 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6237942122186495, "acc_stderr": 0.02751392568354943, "acc_norm": 0.6237942122186495, "acc_norm_stderr": 0.02751392568354943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6975308641975309, "acc_stderr": 0.025557653981868045, "acc_norm": 0.6975308641975309, "acc_norm_stderr": 0.025557653981868045 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4078014184397163, "acc_stderr": 0.029316011776343555, "acc_norm": 0.4078014184397163, "acc_norm_stderr": 0.029316011776343555 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41590612777053454, "acc_stderr": 0.012588323850313608, "acc_norm": 0.41590612777053454, "acc_norm_stderr": 0.012588323850313608 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5477941176470589, "acc_stderr": 0.030233758551596445, "acc_norm": 0.5477941176470589, "acc_norm_stderr": 0.030233758551596445 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5784313725490197, "acc_stderr": 0.019977422600227477, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.019977422600227477 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 
0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.030472526026726496, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.030472526026726496 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7611940298507462, "acc_stderr": 0.03014777593540922, "acc_norm": 0.7611940298507462, "acc_norm_stderr": 0.03014777593540922 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.3011015911872705, "mc1_stderr": 0.016058999026100612, "mc2": 0.433460825483405, "mc2_stderr": 0.01517244922847158 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
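The per-task metrics in the "Latest results" block above are plain nested JSON, so they can be post-processed with a few lines of standard Python once loaded. A minimal sketch of ranking subtasks by accuracy (the small dict below hand-copies a subset of the reported scores rather than downloading the dataset):

```python
# Rank a handful of the MMLU subtask accuracies reported above
# (values copied from the "Latest results" JSON for illustration).
scores = {
    "hendrycksTest-marketing": 0.8461538461538461,
    "hendrycksTest-world_religions": 0.7953216374269005,
    "hendrycksTest-abstract_algebra": 0.3,
    "hendrycksTest-college_physics": 0.2647058823529412,
}

# Sort subtasks from strongest to weakest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to the full results file: load the JSON, filter keys starting with `harness|hendrycksTest-`, and sort on `acc` or `acc_norm`.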
64,933
[ [ -0.049591064453125, -0.060394287109375, 0.0197601318359375, 0.01288604736328125, -0.01021575927734375, -0.0004642009735107422, 0.0015583038330078125, -0.01434326171875, 0.04339599609375, -0.0037937164306640625, -0.03497314453125, -0.0474853515625, -0.03170776367...
Coroseven/YotsubaNakano
2023-10-10T14:10:30.000Z
[ "region:us" ]
Coroseven
null
null
0
0
2023-10-10T14:08:49
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private
2023-10-10T14:11:47.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T14:11:34
--- pretty_name: Evaluation run of HuggingFaceH4/zephyr-7b-alpha dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [HuggingFaceH4/zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-10T14:11:13.991325](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private/blob/main/results_2023-10-10T14-11-13.991325.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6137978230566867,\n\ \ \"acc_stderr\": 0.03380754595328641,\n \"acc_norm\": 0.6176702382672306,\n\ \ \"acc_norm_stderr\": 0.03378555360789072,\n \"mc1\": 0.42717258261933905,\n\ \ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5790339154881958,\n\ \ \"mc2_stderr\": 0.015362629183533977\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5810580204778157,\n \"acc_stderr\": 0.014418106953639011,\n\ \ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.01425295984889289\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6409081856203943,\n\ \ \"acc_stderr\": 0.004787537385153006,\n \"acc_norm\": 0.8403704441346346,\n\ \ \"acc_norm_stderr\": 0.0036551361115537096\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\ \ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\ \ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\ \ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\ \ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\ \ \"acc_norm_stderr\": 0.03827052357950756\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\ \ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\ \ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\ \ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\ \ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\ \ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\ \ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.38095238095238093,\n \"acc_stderr\": 0.02501074911613761,\n \"\ acc_norm\": 0.38095238095238093,\n 
\"acc_norm_stderr\": 0.02501074911613761\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\ \ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\ \ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\ \ \"acc_stderr\": 0.02447224384089553,\n \"acc_norm\": 0.7548387096774194,\n\ \ \"acc_norm_stderr\": 0.02447224384089553\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\ \ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\ \ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\ acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\ \ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \ \ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \ \ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"\ acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"\ acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\ acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460285,\n \ \ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460285\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\ \ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\ \ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\ \ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\ : 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\ \ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\ \ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\ \ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\ \ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\ \ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\ \ \"acc_stderr\": 0.014551310568143704,\n \"acc_norm\": 0.7905491698595147,\n\ \ \"acc_norm_stderr\": 0.014551310568143704\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\ \ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\ \ \"acc_stderr\": 0.01615591072134177,\n \"acc_norm\": 0.37094972067039106,\n\ \ \"acc_norm_stderr\": 
0.01615591072134177\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n\ \ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\ \ \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n\ \ \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\ \ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \ \ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41003911342894395,\n\ \ \"acc_stderr\": 0.012561837621962044,\n \"acc_norm\": 0.41003911342894395,\n\ \ \"acc_norm_stderr\": 0.012561837621962044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n\ \ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \ \ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\ \ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.8009950248756219,\n\ \ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\ \ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n\ \ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5790339154881958,\n\ \ \"mc2_stderr\": 0.015362629183533977\n }\n}\n```" repo_url: https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|arc:challenge|25_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hellaswag|10_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T14_11_13.991325 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-13.991325.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-13.991325.parquet' - config_name: results data_files: - split: 2023_10_10T14_11_13.991325 path: - results_2023-10-10T14-11-13.991325.parquet - split: latest path: - results_2023-10-10T14-11-13.991325.parquet --- # Dataset Card for Evaluation run of HuggingFaceH4/zephyr-7b-alpha ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[HuggingFaceH4/zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-10T14:11:13.991325](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private/blob/main/results_2023-10-10T14-11-13.991325.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6137978230566867, "acc_stderr": 0.03380754595328641, "acc_norm": 0.6176702382672306, "acc_norm_stderr": 0.03378555360789072, "mc1": 0.42717258261933905, "mc1_stderr": 0.017316834410963926, "mc2": 0.5790339154881958, "mc2_stderr": 0.015362629183533977 }, "harness|arc:challenge|25": { "acc": 0.5810580204778157, "acc_stderr": 0.014418106953639011, "acc_norm": 0.6100682593856656, "acc_norm_stderr": 0.01425295984889289 }, "harness|hellaswag|10": { "acc": 0.6409081856203943, "acc_stderr": 0.004787537385153006, "acc_norm": 0.8403704441346346, "acc_norm_stderr": 0.0036551361115537096 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6447368421052632, "acc_stderr": 0.038947344870133176, "acc_norm": 0.6447368421052632, "acc_norm_stderr": 0.038947344870133176 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, 
"acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5319148936170213, "acc_stderr": 0.03261936918467382, "acc_norm": 0.5319148936170213, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.02501074911613761, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.02501074911613761 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.02447224384089553, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.02447224384089553 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306433, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6153846153846154, "acc_stderr": 0.024666744915187208, "acc_norm": 0.6153846153846154, "acc_norm_stderr": 0.024666744915187208 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857403, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857403 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.03128217706368461, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.03128217706368461 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8018348623853211, "acc_stderr": 0.017090573804217902, "acc_norm": 0.8018348623853211, "acc_norm_stderr": 0.017090573804217902 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 
0.033981108902946366, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.033981108902946366 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.028458820991460285, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.028458820991460285 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6367713004484304, "acc_stderr": 0.032277904428505, "acc_norm": 0.6367713004484304, "acc_norm_stderr": 0.032277904428505 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302871, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302871 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507416, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507416 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.03462419931615623, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459753, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459753 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 
0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7905491698595147, "acc_stderr": 0.014551310568143704, "acc_norm": 0.7905491698595147, "acc_norm_stderr": 0.014551310568143704 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6994219653179191, "acc_stderr": 0.0246853168672578, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.0246853168672578 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37094972067039106, "acc_stderr": 0.01615591072134177, "acc_norm": 0.37094972067039106, "acc_norm_stderr": 0.01615591072134177 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6862745098039216, "acc_stderr": 0.02656892101545715, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.02656892101545715 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818774, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818774 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6790123456790124, "acc_stderr": 0.02597656601086274, "acc_norm": 0.6790123456790124, "acc_norm_stderr": 0.02597656601086274 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41003911342894395, "acc_stderr": 0.012561837621962044, "acc_norm": 0.41003911342894395, "acc_norm_stderr": 0.012561837621962044 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6507352941176471, "acc_stderr": 0.028959755196824866, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.028959755196824866 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6290849673202614, "acc_stderr": 0.019542101564854128, "acc_norm": 0.6290849673202614, "acc_norm_stderr": 0.019542101564854128 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 
0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6816326530612244, "acc_stderr": 0.029822533793982066, "acc_norm": 0.6816326530612244, "acc_norm_stderr": 0.029822533793982066 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8009950248756219, "acc_stderr": 0.028231365092758406, "acc_norm": 0.8009950248756219, "acc_norm_stderr": 0.028231365092758406 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.42717258261933905, "mc1_stderr": 0.017316834410963926, "mc2": 0.5790339154881958, "mc2_stderr": 0.015362629183533977 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
65,023
[ [ -0.049224853515625, -0.058807373046875, 0.0191802978515625, 0.0155487060546875, -0.01038360595703125, -0.004451751708984375, 0.002532958984375, -0.017425537109375, 0.040771484375, -0.005645751953125, -0.03350830078125, -0.046844482421875, -0.030120849609375, ...
open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0
2023-10-29T00:51:29.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-10T14:12:22
--- pretty_name: Evaluation run of uukuguy/speechless-tora-code-7b-v1.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [uukuguy/speechless-tora-code-7b-v1.0](https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-29T00:51:17.507006](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0/blob/main/results_2023-10-29T00-51-17.507006.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23468959731543623,\n\ \ \"em_stderr\": 0.004340156396807698,\n \"f1\": 0.2847546140939602,\n\ \ \"f1_stderr\": 0.004356308687759715,\n \"acc\": 0.31907139476284024,\n\ \ \"acc_stderr\": 0.00809586320650362\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.23468959731543623,\n \"em_stderr\": 0.004340156396807698,\n\ \ \"f1\": 0.2847546140939602,\n \"f1_stderr\": 0.004356308687759715\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \ \ \"acc_stderr\": 0.0026153265107756725\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6290449881610103,\n \"acc_stderr\": 0.013576399902231568\n\ \ }\n}\n```" repo_url: https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|arc:challenge|25_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-10T14-11-59.032357.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_29T00_51_17.507006 path: - '**/details_harness|drop|3_2023-10-29T00-51-17.507006.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-29T00-51-17.507006.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_29T00_51_17.507006 path: - '**/details_harness|gsm8k|5_2023-10-29T00-51-17.507006.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-29T00-51-17.507006.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hellaswag|10_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_10T14_11_59.032357 path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-59.032357.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-59.032357.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_29T00_51_17.507006 path: - '**/details_harness|winogrande|5_2023-10-29T00-51-17.507006.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-29T00-51-17.507006.parquet' - config_name: results data_files: - split: 2023_10_10T14_11_59.032357 path: - results_2023-10-10T14-11-59.032357.parquet - split: 2023_10_29T00_51_17.507006 path: - results_2023-10-29T00-51-17.507006.parquet - split: latest path: - results_2023-10-29T00-51-17.507006.parquet
---

# Dataset Card for Evaluation run of uukuguy/speechless-tora-code-7b-v1.0

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [uukuguy/speechless-tora-code-7b-v1.0](https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0",
    "harness_winogrande_5",
    split="latest")
```

## Latest results

These are the [latest results from run 2023-10-29T00:51:17.507006](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0/blob/main/results_2023-10-29T00-51-17.507006.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.23468959731543623,
        "em_stderr": 0.004340156396807698,
        "f1": 0.2847546140939602,
        "f1_stderr": 0.004356308687759715,
        "acc": 0.31907139476284024,
        "acc_stderr": 0.00809586320650362
    },
    "harness|drop|3": {
        "em": 0.23468959731543623,
        "em_stderr": 0.004340156396807698,
        "f1": 0.2847546140939602,
        "f1_stderr": 0.004356308687759715
    },
    "harness|gsm8k|5": {
        "acc": 0.009097801364670205,
        "acc_stderr": 0.0026153265107756725
    },
    "harness|winogrande|5": {
        "acc": 0.6290449881610103,
        "acc_stderr": 0.013576399902231568
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
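The aggregated metrics above can also be post-processed locally without touching the Hub. As a minimal sketch (the `task_accuracies` helper is illustrative, not part of any leaderboard tooling; the JSON is pasted from the latest-results section so the example is self-contained), the per-task accuracies and their mean can be recovered like this:

```python
import json

# Aggregated metrics as reported in the "Latest results" section,
# embedded verbatim so no download is required.
latest_results = json.loads("""
{
    "all": {
        "em": 0.23468959731543623,
        "em_stderr": 0.004340156396807698,
        "f1": 0.2847546140939602,
        "f1_stderr": 0.004356308687759715,
        "acc": 0.31907139476284024,
        "acc_stderr": 0.00809586320650362
    },
    "harness|drop|3": {
        "em": 0.23468959731543623,
        "em_stderr": 0.004340156396807698,
        "f1": 0.2847546140939602,
        "f1_stderr": 0.004356308687759715
    },
    "harness|gsm8k|5": {
        "acc": 0.009097801364670205,
        "acc_stderr": 0.0026153265107756725
    },
    "harness|winogrande|5": {
        "acc": 0.6290449881610103,
        "acc_stderr": 0.013576399902231568
    }
}
""")

def task_accuracies(results):
    """Collect the 'acc' metric for each per-task entry, skipping the 'all' rollup
    and tasks (like drop) that only report em/f1."""
    return {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all" and "acc" in metrics}

accs = task_accuracies(latest_results)
# The "all" accuracy reported above is the mean of the per-task accuracies.
mean_acc = sum(accs.values()) / len(accs)
```

Here `mean_acc` matches the `acc` value under `"all"` up to floating-point rounding, which is a quick sanity check that the rollup was computed over exactly these tasks.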
38,744
[ [ -0.02593994140625, -0.042572021484375, 0.0198211669921875, 0.0233154296875, -0.019622802734375, 0.01253509521484375, -0.0225830078125, -0.011749267578125, 0.0310516357421875, 0.04534912109375, -0.044708251953125, -0.066650390625, -0.04254150390625, 0.0149688...