datasetId: string (length 2–117)
card: string (length 19–1.01M)
Edopangui/promo3
--- license: apache-2.0 ---
Uchenna/Testdataset
--- license: mit language: - en dataset_info: features: - name: product dtype: string - name: description dtype: string - name: advert dtype: string - name: ad dtype: string splits: - name: train num_bytes: 5766 num_examples: 10 download_size: 9580 dataset_size: 5766 ---
AlekseyKorshuk/code-alpaca-eval-debug-completions
--- dataset_info: features: - name: model_input list: - name: content dtype: string - name: role dtype: string - name: baseline_response dtype: string - name: completion dtype: string splits: - name: train num_bytes: 1548 num_examples: 2 download_size: 7523 dataset_size: 1548 configs: - config_name: default data_files: - split: train path: data/train-* ---
open-llm-leaderboard/details_KevinNi__mistral-class-bio-tutor
--- pretty_name: Evaluation run of KevinNi/mistral-class-bio-tutor dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [KevinNi/mistral-class-bio-tutor](https://huggingface.co/KevinNi/mistral-class-bio-tutor)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 1 configuration, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KevinNi__mistral-class-bio-tutor\"\ ,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\ \ are the [latest results from run 2023-12-02T15:48:30.567817](https://huggingface.co/datasets/open-llm-leaderboard/details_KevinNi__mistral-class-bio-tutor/blob/main/results_2023-12-02T15-48-30.567817.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.0,\n        \"\ acc_stderr\": 0.0\n    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.0,\n  \ \      \"acc_stderr\": 0.0\n    }\n}\n```" repo_url: https://huggingface.co/KevinNi/mistral-class-bio-tutor leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_gsm8k_5 data_files: - split: 2023_12_02T15_48_30.567817 path: - '**/details_harness|gsm8k|5_2023-12-02T15-48-30.567817.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-02T15-48-30.567817.parquet' - config_name: results data_files: - split: 2023_12_02T15_48_30.567817 path: - results_2023-12-02T15-48-30.567817.parquet - split: latest path: - results_2023-12-02T15-48-30.567817.parquet --- # Dataset Card for Evaluation run of KevinNi/mistral-class-bio-tutor ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KevinNi/mistral-class-bio-tutor - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [KevinNi/mistral-class-bio-tutor](https://huggingface.co/KevinNi/mistral-class-bio-tutor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). 
To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KevinNi__mistral-class-bio-tutor", "harness_gsm8k_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-02T15:48:30.567817](https://huggingface.co/datasets/open-llm-leaderboard/details_KevinNi__mistral-class-bio-tutor/blob/main/results_2023-12-02T15-48-30.567817.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
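The card above also exposes the aggregated "results" configuration; a minimal sketch of loading it (assuming the `datasets` library is installed and the config/split names declared in the YAML header above):

```python
from datasets import load_dataset

# "results" and "latest" come from the configs section of the card above;
# the "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_KevinNi__mistral-class-bio-tutor",
    "results",
    split="latest",
)
print(results[0])
```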
AdapterOcean/med_alpaca_standardized_cluster_38_alpaca
--- dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 26390473 num_examples: 13489 download_size: 13464677 dataset_size: 26390473 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "med_alpaca_standardized_cluster_38_alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CesarChaMal/my-personal-model
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 41999 num_examples: 426 - name: test num_bytes: 12984 num_examples: 175 download_size: 29597 dataset_size: 54983 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
UtkuC/utku_test_dog
--- license: mit dataset_info: features: - name: image dtype: string - name: label dtype: string splits: - name: train num_bytes: 406 num_examples: 7 download_size: 1482 dataset_size: 406 configs: - config_name: default data_files: - split: train path: data/train-* ---
TrainingDataPro/crowd-counting-dataset
--- license: cc-by-nc-nd-4.0 task_categories: - image-classification - image-to-image language: - en tags: - legal - code --- # Crowd Counting Dataset The dataset includes images featuring crowds of people ranging from **0 to 5000 individuals**. The dataset includes a diverse range of scenes and scenarios, capturing crowds in various settings. Each image in the dataset is accompanied by a corresponding **JSON file** containing detailed labeling information for each person in the crowd for crowd count and classification. ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2F4b51a212e59f575bd6978f215a32aca0%2FFrame%2064.png?generation=1701336719197861&alt=media) **Types of crowds** in the dataset: *0-1000, 1000-2000, 2000-3000, 3000-4000 and 4000-5000* ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2F72e0fed3ad13826d6545ff75a79ed9db%2FFrame%2065.png?generation=1701337622225724&alt=media) This dataset provides a valuable resource for researchers and developers working on crowd counting technology, enabling them to train and evaluate their algorithms with a wide range of crowd sizes and scenarios. It can also be used for benchmarking and comparison of different crowd counting algorithms, as well as for real-world applications such as *public safety and security, urban planning, and retail analytics*. ## Full version of the dataset includes 647 labeled images of crowds, leave a request on **[TrainingData](https://trainingdata.pro/data-market/crowd-counting?utm_source=huggingface&utm_medium=cpc&utm_campaign=crowd-counting-dataset)** to buy the dataset ### Statistics for the dataset (number of images by the crowd's size and image width): ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2F2e9f36820e62a2ef62586fc8e84387e2%2FFrame%2063.png?generation=1701336725293625&alt=media) # Get the Dataset ## This is just an example of the data Leave a request on **[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/crowd-counting?utm_source=huggingface&utm_medium=cpc&utm_campaign=crowd-counting-dataset) to learn about the price and buy the dataset** # Content - **images** - includes original images of crowds placed in subfolders according to its size, - **labels** - includes json-files with labeling and visualised labeling for the images in the previous folder, - **csv file** - includes information for each image in the dataset ### File with the extension .csv - **id**: id of the image, - **image**: link to access the original image, - **label**: link to access the json-file with labeling, - **type**: type of the crowd on the photo ## **[TrainingData](https://trainingdata.pro/data-market/crowd-counting?utm_source=huggingface&utm_medium=cpc&utm_campaign=crowd-counting-dataset)** provides high-quality data annotation tailored to your needs More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>** TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** *keywords: crowd counting, crowd density estimation, people counting, crowd analysis, image annotation, computer vision, deep learning, object detection, object counting, image classification, dense regression, crowd behavior analysis, crowd tracking, head detection, crowd segmentation, crowd motion analysis, image processing, machine learning, artificial intelligence, ai, human detection, crowd sensing, image dataset, public 
safety, crowd management, urban planning, event planning, traffic management*
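A rough sketch of consuming the CSV described above; the file name `annotations.csv` is hypothetical (the sample card does not name the file), and the exact JSON schema of the label files is not documented here:

```python
import pandas as pd
import requests

# Hypothetical file name; the card only documents the columns: id, image, label, type.
df = pd.read_csv("annotations.csv")

row = df.iloc[0]
print(row["type"])                          # crowd-size bucket, e.g. "0-1000"
labels = requests.get(row["label"]).json()  # per-person labeling for this image
print(type(labels))                         # inspect the (undocumented) structure
```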
nastyboget/stackmix_cyrillic
--- license: mit task_categories: - image-to-text language: - ru size_categories: - 100K<n<1M --- Dataset generated from the Cyrillic train set using StackMix ======================================================== Number of images: 300000 Sources: * [Cyrillic dataset](https://www.kaggle.com/datasets/constantinwerner/cyrillic-handwriting-dataset) * [StackMix code](https://github.com/ai-forever/StackMix-OCR)
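A minimal sketch of pulling the generated images with the `datasets` library (assuming the repository loads directly from the Hub; split and column names are not documented in this card):

```python
from datasets import load_dataset

# Assumption: the repo can be loaded as-is; adjust the split name if the
# generated data is stored differently.
ds = load_dataset("nastyboget/stackmix_cyrillic", split="train")
print(ds)  # expected ~300000 synthetic handwriting samples per the card
```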
yaygomii/Dialect_Speech_Corpus_Tamil_with_info
--- dataset_info: features: - name: label dtype: string - name: audio dtype: audio: sampling_rate: 16000 - name: sentence dtype: string - name: gender dtype: string - name: dialect dtype: string splits: - name: train num_bytes: 3019481844.752 num_examples: 10812 download_size: 2873255375 dataset_size: 3019481844.752 configs: - config_name: default data_files: - split: train path: data/train-* ---
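Given the features declared above (16 kHz audio plus sentence, gender and dialect strings), a small sketch of inspecting one example, assuming the `datasets` library is installed:

```python
from datasets import load_dataset

ds = load_dataset("yaygomii/Dialect_Speech_Corpus_Tamil_with_info", split="train")

sample = ds[0]
print(sample["sentence"], sample["gender"], sample["dialect"])
audio = sample["audio"]         # decoded by the Audio feature
print(audio["sampling_rate"])   # 16000, per the dataset_info above
print(len(audio["array"]))      # number of waveform samples
```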
heliosprime/twitter_dataset_1713007117
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 7492 num_examples: 16 download_size: 8456 dataset_size: 7492 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "twitter_dataset_1713007117" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kristinashemet/Knowledge_Based_Questions-Anwers_with_text_from_doc-Part1-24.03
--- dataset_info: features: - name: formatted_data dtype: string splits: - name: train num_bytes: 247196 num_examples: 532 download_size: 65541 dataset_size: 247196 configs: - config_name: default data_files: - split: train path: data/train-* ---
open-llm-leaderboard/details_chargoddard__piano-medley-7b
--- pretty_name: Evaluation run of chargoddard/piano-medley-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chargoddard/piano-medley-7b](https://huggingface.co/chargoddard/piano-medley-7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__piano-medley-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-10T03:24:54.482171](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__piano-medley-7b/blob/main/results_2023-12-10T03-24-54.482171.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6462767300930756,\n\ \ \"acc_stderr\": 0.032134853847514466,\n \"acc_norm\": 0.6489933678568897,\n\ \ \"acc_norm_stderr\": 0.03277330582106223,\n \"mc1\": 0.44063647490820074,\n\ \ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6142054505900651,\n\ \ \"mc2_stderr\": 0.015456544162012987\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n\ \ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518826\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6645090619398526,\n\ \ \"acc_stderr\": 0.004711968379069029,\n \"acc_norm\": 0.8536148177653854,\n\ \ \"acc_norm_stderr\": 0.0035276951498235004\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\ \ \"acc_stderr\": 0.04115324610336953,\n \"acc_norm\": 0.6518518518518519,\n\ \ \"acc_norm_stderr\": 0.04115324610336953\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\ \ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\ \ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\ \ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\ \ \"acc_norm_stderr\": 0.0358687928008034\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\ \ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\ \ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\ \ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\ \ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\ acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\ \ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \ \ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\ \ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\ : 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 
0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964273,\n\ \ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964273\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\ \ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126253,\n \"\ acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126253\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\ \ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\ \ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\ \ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\ \ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\ \ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\ \ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\ \ 
\"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\ \ \"acc_stderr\": 0.013547415658662253,\n \"acc_norm\": 0.8263090676883781,\n\ \ \"acc_norm_stderr\": 0.013547415658662253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n\ \ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\ \ \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.41787709497206704,\n\ \ \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\ \ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\ \ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\ \ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\ \ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \ \ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\ \ \"acc_stderr\": 0.01271540484127774,\n \"acc_norm\": 0.45371577574967403,\n\ \ \"acc_norm_stderr\": 0.01271540484127774\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\ \ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \ \ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\ \ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\ \ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\ \ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\ \ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\ \ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\ \ \"acc_norm\": 0.8245614035087719,\n 
\"acc_norm_stderr\": 0.02917088550072767\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n\ \ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6142054505900651,\n\ \ \"mc2_stderr\": 0.015456544162012987\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5655799848369977,\n \ \ \"acc_stderr\": 0.013653507211411417\n }\n}\n```" repo_url: https://huggingface.co/chargoddard/piano-medley-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|arc:challenge|25_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-10T03-24-54.482171.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|gsm8k|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hellaswag|10_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|truthfulqa:mc|0_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-10T03-24-54.482171.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_10T03_24_54.482171 path: - '**/details_harness|winogrande|5_2023-12-10T03-24-54.482171.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-10T03-24-54.482171.parquet' - config_name: results data_files: - split: 2023_12_10T03_24_54.482171 path: - results_2023-12-10T03-24-54.482171.parquet - split: latest path: - results_2023-12-10T03-24-54.482171.parquet --- # Dataset Card for Evaluation run of chargoddard/piano-medley-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/chargoddard/piano-medley-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [chargoddard/piano-medley-7b](https://huggingface.co/chargoddard/piano-medley-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). 
Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chargoddard__piano-medley-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-10T03:24:54.482171](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__piano-medley-7b/blob/main/results_2023-12-10T03-24-54.482171.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6462767300930756, "acc_stderr": 0.032134853847514466, "acc_norm": 0.6489933678568897, "acc_norm_stderr": 0.03277330582106223, "mc1": 0.44063647490820074, "mc1_stderr": 0.017379697555437446, "mc2": 0.6142054505900651, "mc2_stderr": 0.015456544162012987 }, "harness|arc:challenge|25": { "acc": 0.6399317406143344, "acc_stderr": 0.014027516814585186, "acc_norm": 0.6757679180887372, "acc_norm_stderr": 0.013678810399518826 }, "harness|hellaswag|10": { "acc": 0.6645090619398526, "acc_stderr": 0.004711968379069029, "acc_norm": 0.8536148177653854, "acc_norm_stderr": 0.0035276951498235004 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.04115324610336953, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.04115324610336953 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.02863723563980089, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.02863723563980089 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.0358687928008034, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, 
"harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.02516798233389414, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.02516798233389414 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.02275520495954294, "acc_norm": 0.8, "acc_norm_stderr": 0.02275520495954294 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.02833560973246336, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.02833560973246336 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6794871794871795, "acc_stderr": 0.023661296393964273, "acc_norm": 0.6794871794871795, "acc_norm_stderr": 0.023661296393964273 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948482, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948482 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6974789915966386, "acc_stderr": 0.029837962388291932, "acc_norm": 0.6974789915966386, "acc_norm_stderr": 0.029837962388291932 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8532110091743119, "acc_stderr": 0.015173141845126253, "acc_norm": 0.8532110091743119, "acc_norm_stderr": 0.015173141845126253 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 
0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7130044843049327, "acc_stderr": 0.03036037971029195, "acc_norm": 0.7130044843049327, "acc_norm_stderr": 0.03036037971029195 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507332, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507332 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.013547415658662253, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.013547415658662253 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.024182427496577612, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.024182427496577612 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.01649540063582008, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.01649540063582008 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7314814814814815, "acc_stderr": 0.02465968518596728, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.02465968518596728 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45371577574967403, "acc_stderr": 0.01271540484127774, "acc_norm": 0.45371577574967403, "acc_norm_stderr": 0.01271540484127774 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.028064998167040094, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.028064998167040094 }, 
"harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223977, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223977 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306053, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306053 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.02917088550072767, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.02917088550072767 }, "harness|truthfulqa:mc|0": { "mc1": 0.44063647490820074, "mc1_stderr": 0.017379697555437446, "mc2": 0.6142054505900651, "mc2_stderr": 0.015456544162012987 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987729 }, "harness|gsm8k|5": { "acc": 0.5655799848369977, "acc_stderr": 0.013653507211411417 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
CyberHarem/naga_nikke
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of naga/ナガ/娜嘉/나가 (Nikke: Goddess of Victory)

This is the dataset of naga/ナガ/娜嘉/나가 (Nikke: Goddess of Victory), containing 82 images and their tags. The core tags of this character are `breasts, bangs, large_breasts, hair_over_one_eye, long_hair, dark_skin, hair_ornament, brown_hair, dark-skinned_female, wrist_scrunchie, scrunchie, yellow_eyes, brown_eyes`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                    | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 82     | 139.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 82     | 67.86 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-800.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 201    | 145.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 82     | 117.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-1200.zip)             | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 201    | 226.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/naga_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blue_skirt, collared_shirt, looking_at_viewer, school_uniform, short_sleeves, simple_background, solo, white_background, white_shirt, navel, pleated_skirt, blush, jewelry, midriff, black_choker, crop_top, ear_piercing, thighs, breast_pocket, purple_scrunchie, thigh_strap, black_nails, lifted_by_self, skirt_lift, mouth_hold, parted_lips, smile, striped, cowboy_shot, hair_ribbon, miniskirt, purple_necktie, stomach | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_skirt | collared_shirt | looking_at_viewer | school_uniform | short_sleeves | simple_background | solo | white_background | white_shirt | navel | pleated_skirt | blush | jewelry | midriff | black_choker | crop_top | ear_piercing | thighs | breast_pocket | purple_scrunchie | thigh_strap | black_nails | lifted_by_self | skirt_lift | mouth_hold | parted_lips | smile | striped | cowboy_shot | hair_ribbon | miniskirt | purple_necktie | stomach | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------------|:--------------------|:-----------------|:----------------|:--------------------|:-------|:-------------------|:--------------|:--------|:----------------|:--------|:----------|:----------|:---------------|:-----------|:---------------|:---------|:----------------|:-------------------|:--------------|:--------------|:-----------------|:-------------|:-------------|:--------------|:--------|:----------|:--------------|:--------------|:------------|:-----------------|:----------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
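The packaged IMG+TXT variants listed in the packages table can be fetched the same way as the raw archive; a hedged sketch reusing the card's own `hf_hub_download` pattern for the 800-pixel package (the output directory name is illustrative):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package listed in the packages table
zip_file = hf_hub_download(
    repo_id='CyberHarem/naga_nikke',
    repo_type='dataset',
    filename='dataset-800.zip',
)

out_dir = 'naga_nikke_800'  # illustrative output directory
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# IMG+TXT packages ship images alongside text files holding their tags
print(sorted(os.listdir(out_dir))[:10])
```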
FastFit/clinc_150
---
dataset_info:
  features:
  - name: label
    dtype: string
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 895435
    num_examples: 15000
  - name: validation
    num_bytes: 178841
    num_examples: 3000
  - name: test
    num_bytes: 265940
    num_examples: 4500
  download_size: 469678
  dataset_size: 1340216
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
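The `dataset_info` block above fully describes the schema: a string `text` utterance and a string `label` per example, split into train/validation/test. As a hedged illustration (not part of the original card), loading and inspecting it with the `datasets` library would look roughly like this:

```python
from datasets import load_dataset

# load all three splits declared in the card's dataset_info
ds = load_dataset("FastFit/clinc_150")

print(ds)                    # split names and example counts (15000/3000/4500)
print(ds["train"][0])        # a single {'label': ..., 'text': ...} record
print(ds["train"].features)  # both columns are plain strings per the schema
```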
freshpearYoon/train_free_37
---
dataset_info:
  features:
  - name: input_features
    sequence:
      sequence: float32
  - name: labels
    sequence: int64
  splits:
  - name: train
    num_bytes: 9604553224
    num_examples: 10000
  download_size: 1274731370
  dataset_size: 9604553224
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
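Per the schema above, each row carries a 2-D float32 `input_features` matrix and an integer `labels` sequence. A hedged sketch, not from the original card, of loading one example and checking its shape:

```python
import numpy as np
from datasets import load_dataset

# only a train split is declared; note it is roughly 9.6 GB on disk
ds = load_dataset("freshpearYoon/train_free_37", split="train")

example = ds[0]
features = np.asarray(example["input_features"], dtype=np.float32)  # stored as float32 per the schema
labels = example["labels"]                                          # flat list of integer ids

print(features.shape, features.dtype)
print(len(labels), labels[:10])
```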
heliosprime/twitter_dataset_1713032115
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 12085
    num_examples: 28
  download_size: 9997
  dataset_size: 12085
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "twitter_dataset_1713032115"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
distilabel-internal-testing/instruction-backtranslation-mini
---
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: generation
    dtype: string
  - name: generation_model
    dtype: string
  - name: score
    dtype: int64
  - name: reason
    dtype: string
  - name: scoring_model
    dtype: string
  splits:
  - name: train
    num_bytes: 12446
    num_examples: 10
  download_size: 15973
  dataset_size: 12446
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
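The schema pairs each instruction/generation with an integer `score` and a textual `reason` from a scoring model, so a natural way to consume it is to keep only highly rated rows. A hedged sketch (the threshold of 4 is an assumption, not from the card):

```python
from datasets import load_dataset

ds = load_dataset("distilabel-internal-testing/instruction-backtranslation-mini", split="train")

# keep generations the scoring model rated highly; the >= 4 cut-off is an assumption
high_quality = ds.filter(lambda row: row["score"] is not None and row["score"] >= 4)

for row in high_quality:
    print(f'{row["score"]} ({row["scoring_model"]}): {row["instruction"][:60]}')
```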
ondevicellm/SlimOrca
---
dataset_info:
  features:
  - name: conversations
    list:
    - name: from
      dtype: string
    - name: value
      dtype: string
    - name: weight
      dtype: float64
  - name: messages
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  splits:
  - name: train
    num_bytes: 1838092668
    num_examples: 517982
  download_size: 930649170
  dataset_size: 1838092668
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
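Each row carries the same dialogue twice: as ShareGPT-style `conversations` (from/value/weight) and as chat-style `messages` (role/content). A hedged sketch, not from the original card, of streaming a few rows and reading the `messages` form:

```python
from itertools import islice

from datasets import load_dataset

# streaming is an assumption here, used to avoid pulling the ~930 MB download up front
ds = load_dataset("ondevicellm/SlimOrca", split="train", streaming=True)

for row in islice(ds, 3):
    for message in row["messages"]:
        print(f"[{message['role']}] {message['content'][:60]}")
    print("---")
```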
growth-cadet/eval_mistral_jobpost
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: ats
    dtype: string
  - name: context
    dtype: string
  - name: context_token_count
    dtype: int64
  - name: gpt-4_response
    dtype: string
  - name: gpt-4_cost
    dtype: float64
  - name: gpt-4_sys5_response
    dtype: string
  - name: gpt-4_sys5_cost
    dtype: float64
  - name: sys5_obj
    struct:
    - name: focus_areas
      list:
      - name: description
        dtype: string
      - name: subject
        dtype: string
    - name: industries
      list:
      - name: description
        dtype: string
      - name: subject
        dtype: string
    - name: products_and_technologies
      list:
      - name: description
        dtype: string
      - name: subject
        dtype: string
  - name: mistral01_gen
    dtype: string
  - name: eval_crit
    struct:
    - name: focus_areas
      dtype: float64
    - name: industries
      dtype: float64
    - name: products_and_technologies
      dtype: float64
  splits:
  - name: test
    num_bytes: 18603269
    num_examples: 1806
  download_size: 8491351
  dataset_size: 18603269
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
---
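Per the schema, every test example holds a job-post `context`, GPT-4 reference outputs with their costs, a structured `sys5_obj` (focus areas, industries, and products/technologies as subject/description pairs), the Mistral generation under evaluation (`mistral01_gen`), and per-category scores in `eval_crit`. A hedged sketch, not from the original card, of reading those fields:

```python
from datasets import load_dataset

ds = load_dataset("growth-cadet/eval_mistral_jobpost", split="test")

row = ds[0]
print(row["ats"], "-", row["context_token_count"], "context tokens")

# sys5_obj holds lists of {subject, description} pairs per category (may be empty)
for area in row["sys5_obj"]["focus_areas"] or []:
    print("focus area:", area["subject"])

# per-category scores for the Mistral generation against the structured target
print(row["eval_crit"])
```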
bongsoo/social_science_en_ko
---
language:
- ko
license: apache-2.0
---
- Social science English-Korean (en-ko) translation corpus
hazal/Turkish-Biomedical-corpus-trM
---
language:
- tr
---
open-llm-leaderboard/details_gagan3012__Multilingual-mistral
--- pretty_name: Evaluation run of gagan3012/Multilingual-mistral dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [gagan3012/Multilingual-mistral](https://huggingface.co/gagan3012/Multilingual-mistral)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__Multilingual-mistral\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-16T13:00:17.256624](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multilingual-mistral/blob/main/results_2024-01-16T13-00-17.256624.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6125296970145792,\n\ \ \"acc_stderr\": 0.032987213837153834,\n \"acc_norm\": 0.6173816622782963,\n\ \ \"acc_norm_stderr\": 0.03365203917647103,\n \"mc1\": 0.40024479804161567,\n\ \ \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.555285737063638,\n\ \ \"mc2_stderr\": 0.015637031939929425\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n\ \ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192593\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6274646484763992,\n\ \ \"acc_stderr\": 0.004824917516374184,\n \"acc_norm\": 0.8175662218681538,\n\ \ \"acc_norm_stderr\": 0.00385412337350911\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\ \ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\ \ \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n\ \ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \ \ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\"\ : {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n\ \ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n\ \ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n\ \ \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n\ \ \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \ \ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\ \ 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"\ acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\ \ \"acc_stderr\": 0.046570472605949646,\n \"acc_norm\": 0.4298245614035088,\n\ \ \"acc_norm_stderr\": 0.046570472605949646\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\ \ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\ acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\ \ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\ \ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5903225806451613,\n\ \ \"acc_stderr\": 0.02797605491534735,\n \"acc_norm\": 0.5903225806451613,\n\ \ \"acc_norm_stderr\": 0.02797605491534735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\ \ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\ \ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 
0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\ \ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n\ \ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\ acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848023,\n \"\ acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848023\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"\ acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \ \ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\ \ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\ \ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n\ \ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\ \ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\ \ \"acc_norm_stderr\": 
0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\ \ \"acc_stderr\": 0.014551310568143705,\n \"acc_norm\": 0.7905491698595147,\n\ \ \"acc_norm_stderr\": 0.014551310568143705\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.02513100023364789,\n\ \ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.02513100023364789\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\ \ \"acc_stderr\": 0.01580100372914589,\n \"acc_norm\": 0.33631284916201115,\n\ \ \"acc_norm_stderr\": 0.01580100372914589\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\ \ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\ \ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\ \ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\ \ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \ \ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n\ \ \"acc_stderr\": 0.012680037994097079,\n \"acc_norm\": 0.4406779661016949,\n\ \ \"acc_norm_stderr\": 0.012680037994097079\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \ \ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \ \ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\ \ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\ \ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\ \ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\ \ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\ \ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\ \ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\ \ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n 
\"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\ \ \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.555285737063638,\n\ \ \"mc2_stderr\": 0.015637031939929425\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4025777103866566,\n \ \ \"acc_stderr\": 0.013508523063663439\n }\n}\n```" repo_url: https://huggingface.co/gagan3012/Multilingual-mistral leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|arc:challenge|25_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-16T13-00-17.256624.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|gsm8k|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hellaswag|10_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T13-00-17.256624.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|truthfulqa:mc|0_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-16T13-00-17.256624.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_16T13_00_17.256624 path: - '**/details_harness|winogrande|5_2024-01-16T13-00-17.256624.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-16T13-00-17.256624.parquet' - config_name: results data_files: - split: 2024_01_16T13_00_17.256624 path: - results_2024-01-16T13-00-17.256624.parquet - split: latest path: - results_2024-01-16T13-00-17.256624.parquet --- # Dataset Card for Evaluation run of gagan3012/Multilingual-mistral <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [gagan3012/Multilingual-mistral](https://huggingface.co/gagan3012/Multilingual-mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_gagan3012__Multilingual-mistral", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-16T13:00:17.256624](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multilingual-mistral/blob/main/results_2024-01-16T13-00-17.256624.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6125296970145792, "acc_stderr": 0.032987213837153834, "acc_norm": 0.6173816622782963, "acc_norm_stderr": 0.03365203917647103, "mc1": 0.40024479804161567, "mc1_stderr": 0.017151605555749138, "mc2": 0.555285737063638, "mc2_stderr": 0.015637031939929425 }, "harness|arc:challenge|25": { "acc": 0.5938566552901023, "acc_stderr": 0.014351656690097862, "acc_norm": 0.6228668941979523, "acc_norm_stderr": 0.014163366896192593 }, "harness|hellaswag|10": { "acc": 0.6274646484763992, "acc_stderr": 0.004824917516374184, "acc_norm": 0.8175662218681538, "acc_norm_stderr": 0.00385412337350911 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.037336266553835096, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.037336266553835096 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.044084400227680794, "acc_norm": 0.74, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 
0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.046570472605949646, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.046570472605949646 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.025225450284067877, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.025225450284067877 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5903225806451613, "acc_stderr": 0.02797605491534735, "acc_norm": 0.5903225806451613, "acc_norm_stderr": 0.02797605491534735 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.035145285621750094, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.035145285621750094 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885415, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.02541634309630644, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.02541634309630644 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5948717948717949, "acc_stderr": 0.024890471769938145, "acc_norm": 0.5948717948717949, "acc_norm_stderr": 0.024890471769938145 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6260504201680672, "acc_stderr": 0.031429466378837076, "acc_norm": 0.6260504201680672, "acc_norm_stderr": 0.031429466378837076 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.39072847682119205, "acc_stderr": 0.039837983066598075, "acc_norm": 0.39072847682119205, "acc_norm_stderr": 0.039837983066598075 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8275229357798165, "acc_stderr": 0.016197807956848023, "acc_norm": 0.8275229357798165, "acc_norm_stderr": 0.016197807956848023 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.034086558679777494, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.034086558679777494 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.029102254389674082, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.029102254389674082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 
0.7721518987341772, "acc_stderr": 0.02730348459906943, "acc_norm": 0.7721518987341772, "acc_norm_stderr": 0.02730348459906943 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.0318114974705536, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.0318114974705536 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.038448761397852714, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.038448761397852714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7905491698595147, "acc_stderr": 0.014551310568143705, "acc_norm": 0.7905491698595147, "acc_norm_stderr": 0.014551310568143705 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.02513100023364789, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.02513100023364789 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33631284916201115, "acc_stderr": 0.01580100372914589, "acc_norm": 0.33631284916201115, "acc_norm_stderr": 0.01580100372914589 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.02617390850671858, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.02617390850671858 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539966, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6697530864197531, "acc_stderr": 0.026168298456732846, "acc_norm": 0.6697530864197531, "acc_norm_stderr": 0.026168298456732846 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4406779661016949, "acc_stderr": 0.012680037994097079, "acc_norm": 0.4406779661016949, "acc_norm_stderr": 0.012680037994097079 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.625, "acc_stderr": 0.029408372932278746, "acc_norm": 0.625, "acc_norm_stderr": 0.029408372932278746 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223977, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223977 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6517412935323383, "acc_stderr": 0.033687874661154596, "acc_norm": 0.6517412935323383, "acc_norm_stderr": 0.033687874661154596 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.0389136449583582, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.0389136449583582 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.40024479804161567, "mc1_stderr": 0.017151605555749138, "mc2": 0.555285737063638, "mc2_stderr": 0.015637031939929425 }, "harness|winogrande|5": { "acc": 0.755327545382794, "acc_stderr": 0.012082125654159738 }, "harness|gsm8k|5": { "acc": 0.4025777103866566, "acc_stderr": 0.013508523063663439 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
hpprc/alt-parallel-en-ja
--- dataset_info: features: - name: en dtype: string - name: ja dtype: string splits: - name: train num_bytes: 5793389.111731535 num_examples: 18083 - name: validation num_bytes: 312612 num_examples: 1000 - name: test num_bytes: 322051.6633954858 num_examples: 1017 download_size: 3977033 dataset_size: 6428052.775127021 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* license: cc-by-4.0 task_categories: - translation language: - en - ja pretty_name: ALT --- # Asian Language Treebank (ALT) Project This dataset contains only the English-Japanese parallel portion extracted from the [ALT Parallel Corpus](https://www2.nict.go.jp/astrec-att/member/mutiyama/ALT/). The source data is the [https://huggingface.co/datasets/alt](https://huggingface.co/datasets/alt) dataset hosted on Hugging Face. # Citation ```bibtex @inproceedings{riza2016introduction, title={Introduction of the asian language treebank}, author={Riza, Hammam and Purwoadi, Michael and Uliniansyah, Teduh and Ti, Aw Ai and Aljunied, Sharifah Mahani and Mai, Luong Chi and Thang, Vu Tat and Thai, Nguyen Phuong and Chea, Vichet and Sam, Sethserey and others}, booktitle={2016 Conference of The Oriental Chapter of International Committee for Coordination and Standardization of Speech Databases and Assessment Techniques (O-COCOSDA)}, pages={1--6}, year={2016}, organization={IEEE} } ```
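A minimal loading sketch (assuming the default parquet configuration shown in the metadata above):

```python
from datasets import load_dataset

# Each example is an aligned English/Japanese sentence pair.
dataset = load_dataset("hpprc/alt-parallel-en-ja")

pair = dataset["train"][0]
print(pair["en"])  # English sentence
print(pair["ja"])  # Japanese sentence
```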
hbXNov/sparse_feedback
--- license: mit --- **Title:** Peering Through Preferences: Unraveling Feedback Acquisition for Aligning Large Language Models\ **Paper Link:** https://arxiv.org/abs/2308.15812 \ **Github Link:** https://github.com/Hritikbansal/sparse_feedback
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c8b07d9e
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 184 num_examples: 10 download_size: 1337 dataset_size: 184 --- # Dataset Card for "c8b07d9e" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
taln-ls2n/inspec
--- annotations_creators: - unknown language_creators: - unknown language: - en license: - unknown multilinguality: - monolingual task_categories: - text-mining - text-generation task_ids: - keyphrase-generation - keyphrase-extraction size_categories: - 1K<n<10K pretty_name: Inspec --- # Inspec Benchmark Dataset for Keyphrase Generation ## About Inspec is a dataset for benchmarking keyphrase extraction and generation models. The dataset is composed of 2,000 abstracts of scientific papers collected from the [Inspec database](https://www.theiet.org/resources/inspec/). Keyphrases were annotated by professional indexers in an uncontrolled setting (that is, not limited to thesaurus entries). Details about the inspec dataset can be found in the original paper [(Hulth, 2003)][hulth-2003]. Reference (indexer-assigned) keyphrases are also categorized under the PRMU (<u>P</u>resent-<u>R</u>eordered-<u>M</u>ixed-<u>U</u>nseen) scheme as proposed in [(Boudin and Gallina, 2021)][boudin-2021]. Text pre-processing (tokenization) is carried out using `spacy` (`en_core_web_sm` model) with a special rule to avoid splitting words with hyphens (e.g. graph-based is kept as one token). Stemming (Porter's stemmer implementation provided in `nltk`) is applied before reference keyphrases are matched against the source text. Details about the process can be found in `prmu.py`. ## Content and statistics The dataset is divided into the following three splits: | Split | # documents | #words | # keyphrases | % Present | % Reordered | % Mixed | % Unseen | | :--------- | ----------: | -----: | -----------: | --------: | ----------: | ------: | -------: | | Train | 1,000 | 141.7 | 9.79 | 78.00 | 9.85 | 6.22 | 5.93 | | Validation | 500 | 132.2 | 9.15 | 77.96 | 9.82 | 6.75 | 5.47 | | Test | 500 | 134.8 | 9.83 | 78.70 | 9.92 | 6.48 | 4.91 | The following data fields are available : - **id**: unique identifier of the document. - **title**: title of the document. - **abstract**: abstract of the document. - **keyphrases**: list of reference keyphrases. - **prmu**: list of <u>P</u>resent-<u>R</u>eordered-<u>M</u>ixed-<u>U</u>nseen categories for reference keyphrases. ## References - (Hulth, 2003) Anette Hulth. 2003. [Improved automatic keyword extraction given more linguistic knowledge](https://aclanthology.org/W03-1028). In Proceedings of the 2003 Conference on Empirical Methods in Natural Language Processing, pages 216-223. - (Boudin and Gallina, 2021) Florian Boudin and Ygor Gallina. 2021. [Redefining Absent Keyphrases and their Effect on Retrieval Effectiveness](https://aclanthology.org/2021.naacl-main.330/). In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4185–4193, Online. Association for Computational Linguistics. [hulth-2003]: https://aclanthology.org/W03-1028/ [boudin-2021]: https://aclanthology.org/2021.naacl-main.330/
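For a quick look at the splits and fields described above, a minimal sketch (assuming the dataset loads directly by its Hub identifier):

```python
from datasets import load_dataset

# Loads the train/validation/test splits described above.
dataset = load_dataset("taln-ls2n/inspec")

sample = dataset["train"][0]
print(sample["title"])
print(sample["keyphrases"])  # reference keyphrases assigned by indexers
print(sample["prmu"])        # Present/Reordered/Mixed/Unseen category for each keyphrase
```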
mask-distilled-one-sec-cv12/chunk_93
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1258044796 num_examples: 247063 download_size: 1284316210 dataset_size: 1258044796 --- # Dataset Card for "chunk_93" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
relbert/conceptnet_relational_similarity
--- language: - en license: - other multilinguality: - monolingual size_categories: - n<1K pretty_name: ConceptNet with High Confidence --- # Dataset Card for "relbert/conceptnet_relational_similarity" ## Dataset Description - **Repository:** [RelBERT](https://github.com/asahi417/relbert) - **Paper:** [https://home.ttic.edu/~kgimpel/commonsense.html](https://home.ttic.edu/~kgimpel/commonsense.html) - **Dataset:** Relational similarity dataset based on the high-confidence subset of ConceptNet ### Dataset Summary This is the selected subset of ConceptNet used in [this work](https://home.ttic.edu/~kgimpel/commonsense.html), which was compiled to fine-tune the [RelBERT](https://github.com/asahi417/relbert) model. We removed `NotCapableOf` and `NotDesires` to keep only the positive relations. We use the original test set as the test set, dev1 as the training set, and dev2 as the validation set. ## Dataset Structure ### Data Instances An example of `train` looks as follows. ```shell { "relation_type": "AtLocation", "positives": [["fish", "water"], ["cloud", "sky"], ["child", "school"], ... ], "negatives": [["pen", "write"], ["sex", "fun"], ["soccer", "sport"], ["fish", "school"], ... ] } ``` ### Data Splits | train |validation| test| |--------:|---------:|---------:| | 28| 34 | 16| ### Citation Information ``` @InProceedings{P16-1137, author = "Li, Xiang and Taheri, Aynaz and Tu, Lifu and Gimpel, Kevin", title = "Commonsense Knowledge Base Completion", booktitle = "Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) ", year = "2016", publisher = "Association for Computational Linguistics", pages = "1445--1455", location = "Berlin, Germany", doi = "10.18653/v1/P16-1137", url = "http://aclweb.org/anthology/P16-1137" } ```
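A hedged loading sketch for the structure above (assuming the dataset resolves directly by its Hub identifier):

```python
from datasets import load_dataset

# Each row groups positive and negative word pairs under a single relation type.
dataset = load_dataset("relbert/conceptnet_relational_similarity")

row = dataset["train"][0]
print(row["relation_type"])
print(row["positives"][:3])  # e.g. [["fish", "water"], ["cloud", "sky"], ...]
print(row["negatives"][:3])
```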
abdullahmeda/yolov8-table-detection
--- size_categories: - 1K<n<10K task_categories: - object-detection language: - en tags: - Table - Unstructured Document - YOLOv8 - Object Detection - Table Detection pretty_name: TableDetectionNet --- # Table Detection in Document Images using YOLOv8 The Table Detection YOLO dataset is a collection of document images annotated with table bounding boxes suitable for training object detection models, specifically using the YOLOv8 (You Only Look Once) architecture. The dataset is intended for developing and evaluating table detection algorithms within the field of document analysis and recognition. The annotations define the locations of tables within a variety of document images, which can range from scanned documents to digital PDFs. ### Dataset Labels ```json ['table'] ``` ### Number of Images ```json {"train": 815, "valid": 152, "test": 52} ``` ### Getting Started
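A minimal training sketch, assuming the images and annotations are exported in the standard YOLO layout with a `data.yaml` listing the train/valid/test folders and the single `table` class (file names here are placeholders):

```python
from ultralytics import YOLO  # assumes the ultralytics package is installed

# Fine-tune a pretrained YOLOv8 checkpoint on the table annotations.
model = YOLO("yolov8n.pt")
model.train(data="data.yaml", epochs=50, imgsz=640)

# Detect tables in a document image and print the predicted boxes.
results = model.predict("sample_page.png")
for box in results[0].boxes:
    print(box.xyxy, float(box.conf))
```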
1rsh/speech-qa-awadhi-hi-karya
--- dataset_info: features: - name: audio dtype: audio - name: sentence dtype: string splits: - name: train num_bytes: 98824338.03023759 num_examples: 425 - name: test num_bytes: 9268435.969762418 num_examples: 38 download_size: 106714625 dataset_size: 108092774.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0
--- pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-gpt4-2.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [jondurbin/airoboros-l2-13b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-19T00:42:07.460646](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0/blob/main/results_2023-10-19T00-42-07.460646.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3656669463087248,\n\ \ \"em_stderr\": 0.004932205632924282,\n \"f1\": 0.4289702181208073,\n\ \ \"f1_stderr\": 0.00478287167348305,\n \"acc\": 0.4096206676388381,\n\ \ \"acc_stderr\": 0.009827996178597372\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.3656669463087248,\n \"em_stderr\": 0.004932205632924282,\n\ \ \"f1\": 0.4289702181208073,\n \"f1_stderr\": 0.00478287167348305\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \ \ \"acc_stderr\": 0.00735771352322235\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972392\n\ \ }\n}\n```" repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|arc:challenge|25_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-17T16:46:20.305842.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_19T00_42_07.460646 path: - '**/details_harness|drop|3_2023-10-19T00-42-07.460646.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-19T00-42-07.460646.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_19T00_42_07.460646 path: - '**/details_harness|gsm8k|5_2023-10-19T00-42-07.460646.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-19T00-42-07.460646.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hellaswag|10_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet' - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet' - config_name: 
harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_17T16_46_20.305842 
path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 
2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_17T16_46_20.305842 path: - '**/details_harness|truthfulqa:mc|0_2023-08-17T16:46:20.305842.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-17T16:46:20.305842.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_19T00_42_07.460646 path: - '**/details_harness|winogrande|5_2023-10-19T00-42-07.460646.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-19T00-42-07.460646.parquet' - config_name: results data_files: - split: 2023_08_17T16_46_20.305842 path: - results_2023-08-17T16:46:20.305842.parquet - split: 2023_10_19T00_42_07.460646 path: - results_2023-10-19T00-42-07.460646.parquet - split: latest path: - results_2023-10-19T00-42-07.460646.parquet --- # Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-2.0 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-19T00:42:07.460646](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0/blob/main/results_2023-10-19T00-42-07.460646.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.3656669463087248, "em_stderr": 0.004932205632924282, "f1": 0.4289702181208073, "f1_stderr": 0.00478287167348305, "acc": 0.4096206676388381, "acc_stderr": 0.009827996178597372 }, "harness|drop|3": { "em": 0.3656669463087248, "em_stderr": 0.004932205632924282, "f1": 0.4289702181208073, "f1_stderr": 0.00478287167348305 }, "harness|gsm8k|5": { "acc": 0.07733131159969674, "acc_stderr": 0.00735771352322235 }, "harness|winogrande|5": { "acc": 0.7419100236779794, "acc_stderr": 0.012298278833972392 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
tyzhu/rareid_find_last_sent_train_10_eval_10
--- dataset_info: features: - name: inputs dtype: string - name: targets dtype: string - name: title dtype: string - name: context dtype: string splits: - name: train num_bytes: 38215 num_examples: 30 - name: validation num_bytes: 8924 num_examples: 10 download_size: 42371 dataset_size: 47139 --- # Dataset Card for "rareid_find_last_sent_train_10_eval_10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
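The snippet below is a minimal loading sketch for the two splits described in the metadata above; the field names follow the YAML (`inputs`, `targets`, `title`, `context`), and the printed example is only illustrative.

```python
from datasets import load_dataset

# Load both splits declared in the metadata (train: 30 rows, validation: 10 rows).
ds = load_dataset("tyzhu/rareid_find_last_sent_train_10_eval_10")

sample = ds["train"][0]
# The card lists four string fields: inputs, targets, title, context.
print(sample["title"])
print(sample["inputs"][:200])
print(sample["targets"])
```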
emmanuelr/yelp-review
--- license: other ---
lucadiliello/mnli
--- dataset_info: features: - name: key dtype: string - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 splits: - name: dev_matched num_bytes: 1869989 num_examples: 9815 - name: dev_mismatched num_bytes: 1985345 num_examples: 9832 - name: test_matched num_bytes: 1884664 num_examples: 9796 - name: test_mismatched num_bytes: 1986695 num_examples: 9847 - name: train num_bytes: 76786075 num_examples: 392702 download_size: 54416761 dataset_size: 84512768 --- # Dataset Card for "mnli" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
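As a rough usage sketch (assuming the splits are laid out exactly as in the metadata above), the matched and mismatched dev sets can be loaded by split name:

```python
from datasets import load_dataset

# The card declares five splits: train, dev_matched, dev_mismatched, test_matched, test_mismatched.
dev_matched = load_dataset("lucadiliello/mnli", split="dev_matched")
dev_mismatched = load_dataset("lucadiliello/mnli", split="dev_mismatched")

example = dev_matched[0]
# Each row carries a key, a premise/hypothesis pair, and an integer label.
print(example["premise"])
print(example["hypothesis"], example["label"])
```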
benayas/banking_artificial_5pct_v0
--- dataset_info: features: - name: text dtype: string - name: category dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 1065292 num_examples: 10003 download_size: 315091 dataset_size: 1065292 configs: - config_name: default data_files: - split: train path: data/train-* ---
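A short, hypothetical loading example for the single train split described above; counting categories is just one way to inspect the label distribution and is not part of the original card.

```python
from collections import Counter

from datasets import load_dataset

# Single train split of roughly 10k (text, category) pairs per the metadata.
ds = load_dataset("benayas/banking_artificial_5pct_v0", split="train")

# Rough look at how the intent categories are distributed.
counts = Counter(ds["category"])
print(counts.most_common(10))
```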
averageandyyy/train_librispeech_self
--- dataset_info: features: - name: audio dtype: audio - name: transcript dtype: string splits: - name: train num_bytes: 6992473149.621 num_examples: 28539 download_size: 6432578037 dataset_size: 6992473149.621 --- # Dataset Card for "train_librispeech_self"
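A minimal sketch for reading one example, assuming the audio feature decodes to an array and sampling rate as is usual for the `datasets` Audio feature; streaming is used only to avoid pulling the full download up front.

```python
from datasets import load_dataset

# Streaming avoids downloading all ~6.4 GB of audio before inspecting a sample.
ds = load_dataset("averageandyyy/train_librispeech_self", split="train", streaming=True)

row = next(iter(ds))
audio = row["audio"]  # decoded to {"array": ..., "sampling_rate": ...}
print(row["transcript"])
print(audio["sampling_rate"], len(audio["array"]))
```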
Ammar-Azman/crawl-mufti-perlis
--- license: mit language: - ms tags: - fatwa - malaysia --- # Details - Source: https://muftiperlis.gov.my/ - Scrape date: 26/08/2023
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_1_t_0.25
--- dataset_info: config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: preference dtype: int64 - name: output_1 dtype: string - name: output_2 dtype: string - name: reward_model_prompt_format dtype: string - name: gen_prompt_format dtype: string - name: gen_kwargs struct: - name: do_sample dtype: bool - name: max_new_tokens dtype: int64 - name: pad_token_id dtype: int64 - name: top_k dtype: int64 - name: top_p dtype: float64 - name: reward_1 dtype: float64 - name: reward_2 dtype: float64 - name: n_samples dtype: int64 - name: reject_select dtype: string - name: index dtype: int64 - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string - name: filtered_epoch dtype: int64 - name: gen_reward dtype: float64 - name: gen_response dtype: string splits: - name: epoch_0 num_bytes: 43626381 num_examples: 18928 - name: epoch_1 num_bytes: 44128763 num_examples: 18928 - name: epoch_2 num_bytes: 44212537 num_examples: 18928 - name: epoch_3 num_bytes: 44243636 num_examples: 18928 - name: epoch_4 num_bytes: 44262345 num_examples: 18928 - name: epoch_5 num_bytes: 44266457 num_examples: 18928 - name: epoch_6 num_bytes: 44263294 num_examples: 18928 - name: epoch_7 num_bytes: 44258156 num_examples: 18928 - name: epoch_8 num_bytes: 44249911 num_examples: 18928 - name: epoch_9 num_bytes: 44245361 num_examples: 18928 - name: epoch_10 num_bytes: 44242776 num_examples: 18928 - name: epoch_11 num_bytes: 44242939 num_examples: 18928 - name: epoch_12 num_bytes: 44242014 num_examples: 18928 - name: epoch_13 num_bytes: 44243353 num_examples: 18928 - name: epoch_14 num_bytes: 44241464 num_examples: 18928 - name: epoch_15 num_bytes: 44242426 num_examples: 18928 - name: epoch_16 num_bytes: 44242676 num_examples: 18928 - name: epoch_17 num_bytes: 44240779 num_examples: 18928 - name: epoch_18 num_bytes: 44239833 num_examples: 18928 - name: epoch_19 num_bytes: 44239070 num_examples: 18928 - name: epoch_20 num_bytes: 44240876 num_examples: 18928 - name: epoch_21 num_bytes: 44241236 num_examples: 18928 - name: epoch_22 num_bytes: 44240341 num_examples: 18928 - name: epoch_23 num_bytes: 44241941 num_examples: 18928 - name: epoch_24 num_bytes: 44239980 num_examples: 18928 - name: epoch_25 num_bytes: 44240578 num_examples: 18928 - name: epoch_26 num_bytes: 44241984 num_examples: 18928 - name: epoch_27 num_bytes: 44241784 num_examples: 18928 - name: epoch_28 num_bytes: 44240348 num_examples: 18928 - name: epoch_29 num_bytes: 44240981 num_examples: 18928 download_size: 679231155 dataset_size: 1326584220 configs: - config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 data_files: - split: epoch_0 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-* - split: epoch_1 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-* - split: epoch_2 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-* - split: epoch_3 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-* - split: epoch_4 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-* - split: epoch_5 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-* - split: epoch_6 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-* - split: epoch_7 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-* - split: epoch_8 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-* - split: epoch_9 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-* - split: epoch_10 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-* - split: epoch_11 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-* - split: epoch_12 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-* - split: epoch_13 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-* - split: epoch_14 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-* - split: epoch_15 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-* - split: epoch_16 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-* - split: epoch_17 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-* - split: epoch_18 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-* - split: epoch_19 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-* - split: epoch_20 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-* - split: epoch_21 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-* - split: epoch_22 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-* - split: epoch_23 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-* - split: epoch_24 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-* - split: epoch_25 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-* - split: epoch_26 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-* - split: epoch_27 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-* - split: epoch_28 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-* - split: epoch_29 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-* ---
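Because this repository exposes a single named configuration with one split per epoch, both have to be passed explicitly when loading; the sketch below takes the config and split names verbatim from the metadata above and is illustrative only.

```python
from datasets import load_dataset

config = "alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1"

# Each epoch_* split holds 18,928 preference rows.
epoch0 = load_dataset(
    "Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_1_t_0.25",
    config,
    split="epoch_0",
)

row = epoch0[0]
print(row["instruction"])
print(row["reward_1"], row["reward_2"], row["preference"])
```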
davanstrien/amazonian_fish_classifier_data
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': Ancistrus '1': Apistogramma '2': Astyanax '3': Bario '4': Bryconops '5': Bujurquina '6': Bunocephalus '7': Characidium '8': Charax '9': Copella '10': Corydoras '11': Creagrutus '12': Curimata '13': Doras '14': Erythrinus '15': Gasteropelecus '16': Gymnotus '17': Hemigrammus '18': Hyphessobrycon '19': Knodus '20': Moenkhausia '21': Otocinclus '22': Oxyropsis '23': Phenacogaster '24': Pimelodella '25': Prochilodus '26': Pygocentrus '27': Pyrrhulina '28': Rineloricaria '29': Sorubim '30': Tatia '31': Tetragonopterus '32': Tyttocharax splits: - name: train num_bytes: 1068363405 num_examples: 3068 download_size: 330399200 dataset_size: 1068363405 task_categories: - image-classification pretty_name: cc license: cc-by-4.0 tags: - biology - lam size_categories: - 1K<n<10K ---
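A small sketch showing how the integer labels map back to the genus names listed in the class_label metadata; nothing beyond standard `datasets` usage is implied by the card itself.

```python
from datasets import load_dataset

ds = load_dataset("davanstrien/amazonian_fish_classifier_data", split="train")

# The label feature is a ClassLabel with 33 genus names (Ancistrus, Apistogramma, ...).
label_names = ds.features["label"].names
example = ds[0]
print(example["image"].size)          # PIL image
print(label_names[example["label"]])  # human-readable genus
```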
malaysia-ai/malay-conversational-speech-corpus
--- dataset_info: features: - name: 'Y' dtype: string - name: id dtype: string - name: gender dtype: string - name: filename dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 48785004.736 num_examples: 3241 download_size: 47709555 dataset_size: 48785004.736 language: - ms --- # malay-conversational-speech-corpus Mirror of https://magichub.com/datasets/malay-conversational-speech-corpus/, distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
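A brief usage sketch for the fields listed above, assuming the 'Y' column holds the transcript text (the card does not say so explicitly):

```python
from datasets import load_dataset

ds = load_dataset("malaysia-ai/malay-conversational-speech-corpus", split="train")

# Print a few rows of speaker id, gender, and (assumed) transcript; 16 kHz audio sits in "filename".
for i in range(3):
    row = ds[i]
    print(row["id"], row["gender"], row["Y"])
```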
pvduy/sft_300k
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: messages list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 685811565 num_examples: 300000 - name: test num_bytes: 5690712 num_examples: 1000 download_size: 353817766 dataset_size: 691502277 --- # Dataset Card for "sft_300k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
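The messages column follows the common chat format of role/content turns; the snippet below is an illustrative sketch, not something taken from the original card.

```python
from datasets import load_dataset

# The small test split (1,000 rows) is enough for a quick look.
ds = load_dataset("pvduy/sft_300k", split="test")

# Each row is a conversation: a list of {"role": ..., "content": ...} turns.
conversation = ds[0]["messages"]
for turn in conversation:
    print(f'{turn["role"]}: {turn["content"][:80]}')
```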
open-llm-leaderboard/details_SaylorTwift__gpt2_test
--- pretty_name: Evaluation run of SaylorTwift/gpt2_test dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [SaylorTwift/gpt2_test](https://huggingface.co/SaylorTwift/gpt2_test) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SaylorTwift__gpt2_test\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-22T16:48:41.866587](https://huggingface.co/datasets/open-llm-leaderboard/details_SaylorTwift__gpt2_test/blob/main/results_2023-09-22T16-48-41.866587.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0025167785234899327,\n\ \ \"em_stderr\": 0.0005131152834514814,\n \"f1\": 0.04780411073825513,\n\ \ \"f1_stderr\": 0.0013732412097489425,\n \"acc\": 0.25210824971442214,\n\ \ \"acc_stderr\": 0.007783509925876779\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514814,\n\ \ \"f1\": 0.04780411073825513,\n \"f1_stderr\": 0.0013732412097489425\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \ \ \"acc_stderr\": 0.0015145735612245488\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529009\n\ \ }\n}\n```" repo_url: https://huggingface.co/SaylorTwift/gpt2_test leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|arc:challenge|25_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-07-19T19:08:58.298962.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_22T16_48_41.866587 path: - '**/details_harness|drop|3_2023-09-22T16-48-41.866587.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-22T16-48-41.866587.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_22T16_48_41.866587 path: - '**/details_harness|gsm8k|5_2023-09-22T16-48-41.866587.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-22T16-48-41.866587.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hellaswag|10_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet' - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet' - config_name: 
harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_07_19T19_08_58.298962 
path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 
2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_07_19T19_08_58.298962 path: - '**/details_harness|truthfulqa:mc|0_2023-07-19T19:08:58.298962.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-07-19T19:08:58.298962.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_22T16_48_41.866587 path: - '**/details_harness|winogrande|5_2023-09-22T16-48-41.866587.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-22T16-48-41.866587.parquet' - config_name: results data_files: - split: 2023_07_19T19_08_58.298962 path: - results_2023-07-19T19:08:58.298962.parquet - split: 2023_09_22T16_48_41.866587 path: - results_2023-09-22T16-48-41.866587.parquet - split: latest path: - results_2023-09-22T16-48-41.866587.parquet --- # Dataset Card for Evaluation run of SaylorTwift/gpt2_test ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/SaylorTwift/gpt2_test - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [SaylorTwift/gpt2_test](https://huggingface.co/SaylorTwift/gpt2_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SaylorTwift__gpt2_test", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-22T16:48:41.866587](https://huggingface.co/datasets/open-llm-leaderboard/details_SaylorTwift__gpt2_test/blob/main/results_2023-09-22T16-48-41.866587.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0025167785234899327, "em_stderr": 0.0005131152834514814, "f1": 0.04780411073825513, "f1_stderr": 0.0013732412097489425, "acc": 0.25210824971442214, "acc_stderr": 0.007783509925876779 }, "harness|drop|3": { "em": 0.0025167785234899327, "em_stderr": 0.0005131152834514814, "f1": 0.04780411073825513, "f1_stderr": 0.0013732412097489425 }, "harness|gsm8k|5": { "acc": 0.003032600454890068, "acc_stderr": 0.0015145735612245488 }, "harness|winogrande|5": { "acc": 0.5011838989739542, "acc_stderr": 0.014052446290529009 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
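The snippet above targets the latest winogrande details; the aggregated metrics of each run live in the `results` configuration. A minimal sketch, assuming the configurations and timestamped split names listed in the YAML header above:

```python
from datasets import load_dataset

# Aggregated metrics for every run are stored in the "results" configuration;
# "latest" points at the most recent run (2023-09-22 here).
results = load_dataset(
    "open-llm-leaderboard/details_SaylorTwift__gpt2_test",
    "results",
    split="latest",
)

# A specific run can be selected through its timestamped split name.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_SaylorTwift__gpt2_test",
    "harness_winogrande_5",
    split="2023_09_22T16_48_41.866587",
)
print(results)
print(winogrande_details[0])
```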
CyberHarem/hiragi_kii_ahogirl
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of Hiragi Kii

This is the dataset of Hiragi Kii, containing 46 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name        | Images | Download                            | Description                                                              |
|:------------|-------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw         | 46     | [Download](dataset-raw.zip)         | Raw data with meta information.                                          |
| raw-stage3  | 95     | [Download](dataset-raw-stage3.zip)  | 3-stage cropped raw data with meta information.                          |
| 384x512     | 46     | [Download](dataset-384x512.zip)     | 384x512 aligned dataset.                                                 |
| 512x512     | 46     | [Download](dataset-512x512.zip)     | 512x512 aligned dataset.                                                 |
| 512x704     | 46     | [Download](dataset-512x704.zip)     | 512x704 aligned dataset.                                                 |
| 640x640     | 46     | [Download](dataset-640x640.zip)     | 640x640 aligned dataset.                                                 |
| 640x880     | 46     | [Download](dataset-640x880.zip)     | 640x880 aligned dataset.                                                 |
| stage3-640  | 95     | [Download](dataset-stage3-640.zip)  | 3-stage cropped dataset with the shorter side not exceeding 640 pixels.  |
| stage3-800  | 95     | [Download](dataset-stage3-800.zip)  | 3-stage cropped dataset with the shorter side not exceeding 800 pixels.  |
| stage3-1200 | 95     | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
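The packaged variants in the table above are plain zip files stored in this dataset repository, so they can be fetched directly with `huggingface_hub`. A minimal sketch (the 512x512 package is used here purely as an example):

```python
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

# Download one of the packages listed in the table above from this dataset repo.
zip_path = hf_hub_download(
    repo_id="CyberHarem/hiragi_kii_ahogirl",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)

# Unpack the images (and their tag files) for local inspection or training.
with ZipFile(zip_path) as archive:
    archive.extractall("hiragi_kii_512x512")
```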
maghwa/OpenHermes-2-AR-10K-36-800k-810k
--- dataset_info: features: - name: topic dtype: 'null' - name: model dtype: 'null' - name: avatarUrl dtype: 'null' - name: source dtype: string - name: hash dtype: 'null' - name: conversations dtype: string - name: title dtype: 'null' - name: model_name dtype: 'null' - name: system_prompt dtype: 'null' - name: views dtype: float64 - name: skip_prompt_formatting dtype: 'null' - name: idx dtype: 'null' - name: category dtype: 'null' - name: id dtype: 'null' - name: language dtype: 'null' - name: custom_instruction dtype: 'null' splits: - name: train num_bytes: 25076153 num_examples: 10001 download_size: 11308805 dataset_size: 25076153 configs: - config_name: default data_files: - split: train path: data/train-* ---
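Given the schema above (a single `train` split whose rows keep the conversation as a raw string next to mostly-null metadata columns), a minimal loading sketch could look like this:

```python
from datasets import load_dataset

# Single train split with ~10k rows, as described in dataset_info above.
ds = load_dataset("maghwa/OpenHermes-2-AR-10K-36-800k-810k", split="train")

example = ds[0]
print(example["source"])                    # origin of the sample
print(str(example["conversations"])[:300])  # the conversation, stored as one string
```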
rmanluo/RoG-cwq
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: id dtype: string - name: question dtype: string - name: answer sequence: string - name: q_entity sequence: string - name: a_entity sequence: string - name: graph sequence: sequence: string - name: choices sequence: 'null' splits: - name: train num_bytes: 8890766478 num_examples: 27639 - name: validation num_bytes: 1170336525 num_examples: 3519 - name: test num_bytes: 1208452620 num_examples: 3531 download_size: 1993772283 dataset_size: 11269555623 --- # Dataset Card for "RoG-cwq" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
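Based on the features above, each record pairs a question with its answer entities and a retrieved subgraph stored as a sequence of string sequences. A minimal sketch of loading the test split:

```python
from datasets import load_dataset

# train / validation / test splits are available; use test for evaluation.
ds = load_dataset("rmanluo/RoG-cwq", split="test")

sample = ds[0]
print(sample["question"])
print(sample["q_entity"], "->", sample["a_entity"])

# The subgraph is a list of string lists, presumably (head, relation, tail) triples.
for triple in sample["graph"][:5]:
    print(triple)
```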
BeIR/dbpedia-entity-generated-queries
--- annotations_creators: [] language_creators: [] language: - en license: - cc-by-sa-4.0 multilinguality: - monolingual paperswithcode_id: beir pretty_name: BEIR Benchmark size_categories: msmarco: - 1M<n<10M trec-covid: - 100k<n<1M nfcorpus: - 1K<n<10K nq: - 1M<n<10M hotpotqa: - 1M<n<10M fiqa: - 10K<n<100K arguana: - 1K<n<10K touche-2020: - 100K<n<1M cqadupstack: - 100K<n<1M quora: - 100K<n<1M dbpedia: - 1M<n<10M scidocs: - 10K<n<100K fever: - 1M<n<10M climate-fever: - 1M<n<10M scifact: - 1K<n<10K source_datasets: [] task_categories: - text-retrieval - zero-shot-retrieval - information-retrieval - zero-shot-information-retrieval task_ids: - passage-retrieval - entity-linking-retrieval - fact-checking-retrieval - tweet-retrieval - citation-prediction-retrieval - duplication-question-retrieval - argument-retrieval - news-retrieval - biomedical-information-retrieval - question-answering-retrieval --- # Dataset Card for BEIR Benchmark ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://github.com/UKPLab/beir - **Repository:** https://github.com/UKPLab/beir - **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ - **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns - **Point of Contact:** nandan.thakur@uwaterloo.ca ### Dataset Summary BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks: - Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact) - Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/) - Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) - News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html) - Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](tp://argumentation.bplaced.net/arguana/data) - Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) - Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs) - Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html) - Entity 
Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/) All these datasets have been preprocessed and can be used for your experiments. ```python ``` ### Supported Tasks and Leaderboards The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia. The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/). ### Languages All tasks are in English (`en`). ## Dataset Structure All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format: - `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}` - `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}` - `qrels` file: a `.tsv` file (tab-seperated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score` in this order. Keep 1st row as header. For example: `q1 doc1 1` ### Data Instances A high level example of any beir dataset: ```python corpus = { "doc1" : { "title": "Albert Einstein", "text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \ one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \ its influence on the philosophy of science. He is best known to the general public for his mass–energy \ equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \ Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \ of the photoelectric effect', a pivotal step in the development of quantum theory." }, "doc2" : { "title": "", # Keep title an empty string if not present "text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \ malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\ with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)." }, } queries = { "q1" : "Who developed the mass-energy equivalence formula?", "q2" : "Which beer is brewed with a large proportion of wheat?" } qrels = { "q1" : {"doc1": 1}, "q2" : {"doc2": 1}, } ``` ### Data Fields Examples from all configurations have the following features: ### Corpus - `corpus`: a `dict` feature representing the document title and passage text, made up of: - `_id`: a `string` feature representing the unique document id - `title`: a `string` feature, denoting the title of the document. - `text`: a `string` feature, denoting the text of the document. ### Queries - `queries`: a `dict` feature representing the query, made up of: - `_id`: a `string` feature representing the unique query id - `text`: a `string` feature, denoting the text of the query. 
### Qrels - `qrels`: a `dict` feature representing the query document relevance judgements, made up of: - `_id`: a `string` feature representing the query id - `_id`: a `string` feature, denoting the document id. - `score`: a `int32` feature, denoting the relevance judgement between query and document. ### Data Splits | Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 | | -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:| | MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` | | TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` | | NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` | | BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) | | NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` | | HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` | | FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` | | Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) | | TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) | | ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` | | Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` | | CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` | | Quora| 
[Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` | | DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` | | SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` | | FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` | | Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` | | SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` | | Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) | ## Dataset Creation ### Curation Rationale [Needs More Information] ### Source Data #### Initial Data Collection and Normalization [Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process [Needs More Information] #### Who are the annotators? [Needs More Information] ### Personal and Sensitive Information [Needs More Information] ## Considerations for Using the Data ### Social Impact of Dataset [Needs More Information] ### Discussion of Biases [Needs More Information] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators [Needs More Information] ### Licensing Information [Needs More Information] ### Citation Information Cite as: ``` @inproceedings{ thakur2021beir, title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models}, author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych}, booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)}, year={2021}, url={https://openreview.net/forum?id=wCu6T5xFjeJ} } ``` ### Contributions Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset.
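The corpus/queries/qrels layout described above is the format consumed by the `beir` toolkit, so any of the downloadable zips in the table can be loaded the same way. A minimal sketch, assuming the `beir` package is installed (SciFact is used here as a small example):

```python
from beir import util
from beir.datasets.data_loader import GenericDataLoader

# Download and unzip one of the preprocessed datasets from the table above.
url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip"
data_path = util.download_and_unzip(url, "datasets")

# corpus: {doc_id: {"title": ..., "text": ...}}, queries: {query_id: text},
# qrels:  {query_id: {doc_id: relevance}} -- the exact layout described above.
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
print(len(corpus), len(queries), len(qrels))
```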
yashnbx/gita_supersite_dump
--- dataset_info: features: - name: shloka_id dtype: string - name: chapter dtype: string - name: sutra dtype: string - name: trans-htrskd dtype: string description: Hindi Translation By Swami Ramsukhdas - name: trans-httyn dtype: string description: Hindi Translation By Swami Tejomayananda - name: trans-hcchi dtype: string description: Hindi Commentary By Swami Chinmayananda - name: trans-hcrskd dtype: string description: Hindi Commentary By Swami Ramsukhdas - name: trans-scang dtype: string description: Sanskrit Commentary By Sri Abhinavgupta - name: trans-scram dtype: string description: Sanskrit Commentary By Sri Ramanujacharya - name: trans-scanand dtype: string description: Sanskrit Commentary By Sri Anandgiri - name: trans-scval dtype: string description: Sanskrit Commentary By Sri Vallabhacharya - name: trans-scms dtype: string description: Sanskrit Commentary By Sri Madhusudan Saraswati - name: trans-scsri dtype: string description: Sanskrit Commentary By Sri Sridhara Swami - name: trans-scvv dtype: string description: Sanskrit Commentary By Sri Vedantadeshikacharya Venkatanatha - name: trans-scpur dtype: string description: Sanskrit Commentary By Sri Purushottamji - name: trans-scneel dtype: string description: Sanskrit Commentary By Sri Neelkanth - name: trans-scdhan dtype: string description: Sanskrit Commentary By Sri Dhanpati - name: trans-ecsiva dtype: string description: English Commentary By Swami Sivananda - name: trans-etsiva dtype: string description: English Translation By Swami Sivananda - name: trans-etpurohit dtype: string description: English Translation By Purohit Swami - name: trans-etgb dtype: string description: English Translation By Swami Gambirananda - name: trans-setgb dtype: string description: English Translation Of Sri Shankaracharya By Swami Gambirananda - name: trans-etssa dtype: string description: English Translation By Dr. S. Sankaranarayan - name: trans-etassa dtype: string description: English Translation of Abhinavgupta's Sanskrit Commentary By Dr. S. 
Sankaranarayan - name: trans-etradi dtype: string description: English Translation of Ramanujacharya's Sanskrit Commentary By Swami Adidevananda - name: trans-etadi dtype: string description: English Translation By Swami Adidevananda - name: trans-htshg dtype: string description: Hindi Translation Of Sri Shankaracharya's Sanskrit Commentary By Sri Harikrishnadas Goenka - name: trans-scsh dtype: string description: Sanskrit Commentary By Sri Shankaracharya - name: trans-scjaya dtype: string description: Sanskrit Commentary By Sri Jayatirtha - name: trans-scmad dtype: string description: Sanskrit Commentary By Sri Madhvacharya - name: script-dv dtype: string description: Devanagari - name: script-as dtype: string description: Assamese - name: script-bn dtype: string description: Bengali - name: script-gu dtype: string description: Gujarati - name: script-pa dtype: string description: Gurmukhi - name: script-kn dtype: string description: Kannada - name: script-ml dtype: string description: Malayalam - name: script-or dtype: string description: Odia - name: script-ro dtype: string description: Roman - name: script-ta dtype: string description: Tamil - name: script-te dtype: string description: Telugu splits: - name: train num_bytes: 31628579 num_examples: 701 download_size: 11660830 dataset_size: 31628579 size_categories: - n<1K --- # Dataset Card for "gita_supersite_dump" Extracted from: [gitasupersite.iitk](https://www.gitasupersite.iitk.ac.in/) To recreate checkout [this notebook](./dump.ipynb) Translation column names: - `htrskd` - Hindi Translation By Swami Ramsukhdas - `httyn` - Hindi Translation By Swami Tejomayananda - `htshg` - Hindi Translation Of Sri Shankaracharya's Sanskrit Commentary By Sri Harikrishnadas Goenka - `scsh` - Sanskrit Commentary By Sri Shankaracharya - `hcchi` - Hindi Commentary By Swami Chinmayananda - `hcrskd` - Hindi Commentary By Swami Ramsukhdas - `scang` - Sanskrit Commentary By Sri Abhinavgupta - `scram` - Sanskrit Commentary By Sri Ramanujacharya - `scanand` - Sanskrit Commentary By Sri Anandgiri - `scjaya` - Sanskrit Commentary By Sri Jayatirtha - `scmad` - Sanskrit Commentary By Sri Madhvacharya - `scval` - Sanskrit Commentary By Sri Vallabhacharya - `scms` - Sanskrit Commentary By Sri Madhusudan Saraswati - `scsri` - Sanskrit Commentary By Sri Sridhara Swami - `scvv` - Sanskrit Commentary By Sri Vedantadeshikacharya Venkatanatha - `scpur` - Sanskrit Commentary By Sri Purushottamji - `scneel` - Sanskrit Commentary By Sri Neelkanth - `scdhan` - Sanskrit Commentary By Sri Dhanpati - `ecsiva` - English Commentary By Swami Sivananda - `etsiva` - English Translation By Swami Sivananda - `etpurohit` - English Translation By Purohit Swami - `etgb` - English Translation By Swami Gambirananda - `setgb` - English Translation Of Sri Shankaracharya By Swami Gambirananda - `etssa` - English Translation By Dr. S. Sankaranarayan - `etassa` - English Translation of Abhinavgupta's Sanskrit Commentary By Dr. S. Sankaranarayan - `etradi` - English Translation of Ramanujacharya's Sanskrit Commentary By Swami Adidevananda - `etadi` - English Translation By Swami Adidevananda Script column names: - `dv` - "Devanagari" - `as` - "Assamese" - `bn` - "Bengali" - `gu` - "Gujarati" - `pa` - "Gurmukhi" - `kn` - "Kannada" - `ml` - "Malayalam" - `or` - "Odia" - `ro` - "Roman" - `ta` - "Tamil" - `te` - "Telugu"
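With the column naming above (`trans-*` for translations and commentaries, `script-*` for the verse text in different scripts), a minimal sketch of pulling one shloka out of the dump:

```python
from datasets import load_dataset

# Single train split, one row per shloka (701 rows).
ds = load_dataset("yashnbx/gita_supersite_dump", split="train")

verse = ds[0]
print(verse["chapter"], verse["shloka_id"])
print(verse["script-dv"])     # verse text in Devanagari
print(verse["trans-etsiva"])  # English translation by Swami Sivananda
print(verse["trans-htrskd"])  # Hindi translation by Swami Ramsukhdas
```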
Gabriel/citesum_swe
---
language:
- sv
license:
- cc-by-nc-4.0
size_categories:
- 10K<n<100K
source_datasets:
- https://github.com/morningmoni/CiteSum
task_categories:
- summarization
- text2text-generation
task_ids: []
tags:
- conditional-text-generation
---

# Dataset Card for Swedish Citesum Dataset

The Swedish citesum dataset has only been machine-translated to improve downstream fine-tuning on Swedish summarization tasks.

## Dataset Summary

Read about the full details at the original English version: https://huggingface.co/datasets/citesum

### Paper

https://arxiv.org/abs/2205.06207

### Authors

Yuning Mao, Ming Zhong, Jiawei Han

University of Illinois Urbana-Champaign

{yuningm2, mingz5, hanj}@illinois.edu

## Data details

- src (string): source text. Long description of the paper.
- tgt (string): target text. TLDR of the paper.
- paper_id (string): unique id for the paper
- title (string): title of the paper
- discipline (dict):
  - venue (string): where the paper was published (conference)
  - journal (string): journal in which the paper was published
  - mag_field_of_study (list[str]): scientific fields that the paper falls under

### Data Splits

The Swedish citesum dataset follows the same splits as the original English version and has 3 splits: _train_, _validation_, and _test_.

| Dataset Split | Number of Instances in Split |
| ------------- | ---------------------------- |
| Train         | 83,304                       |
| Validation    | 4,721                        |
| Test          | 4,921                        |
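A minimal sketch of loading the splits above for summarization fine-tuning, assuming the dataset resolves directly through `load_dataset` (the `src` column is the model input and `tgt` the target summary):

```python
from datasets import load_dataset

# train / validation / test splits as listed in the table above.
ds = load_dataset("Gabriel/citesum_swe")

example = ds["train"][0]
print(example["title"])
print(example["src"][:300])  # long description of the paper (input)
print(example["tgt"])        # short TLDR (summarization target)
```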
open-llm-leaderboard/details_ResplendentAI__Datura_7B
--- pretty_name: Evaluation run of ResplendentAI/Datura_7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ResplendentAI__Datura_7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-09T18:45:55.161838](https://huggingface.co/datasets/open-llm-leaderboard/details_ResplendentAI__Datura_7B/blob/main/results_2024-03-09T18-45-55.161838.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.64722982592556,\n\ \ \"acc_stderr\": 0.03230063546213469,\n \"acc_norm\": 0.6468974706224688,\n\ \ \"acc_norm_stderr\": 0.03297483961998492,\n \"mc1\": 0.5520195838433293,\n\ \ \"mc1_stderr\": 0.017408513063422913,\n \"mc2\": 0.7102687357878247,\n\ \ \"mc2_stderr\": 0.014983390722269722\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957009,\n\ \ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601327\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.716988647679745,\n\ \ \"acc_stderr\": 0.0044954128683246065,\n \"acc_norm\": 0.882692690699064,\n\ \ \"acc_norm_stderr\": 0.003211284760701654\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\ \ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\ \ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\ \ \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\ : 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\ \ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\ \ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\ \ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\ \ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\ acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\ \ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\ \ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\ \ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\ \ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\ \ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538804,\n \"\ acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538804\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 
0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\ \ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n\ \ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\ acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\ acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \ \ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\ \ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\ \ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\ \ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ 
\"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\ \ \"acc_stderr\": 0.013890862162876163,\n \"acc_norm\": 0.8148148148148148,\n\ \ \"acc_norm_stderr\": 0.013890862162876163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\ \ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\ \ \"acc_stderr\": 0.01644283065471554,\n \"acc_norm\": 0.40893854748603353,\n\ \ \"acc_norm_stderr\": 0.01644283065471554\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\ \ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\ \ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\ \ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\ \ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\ \ \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n\ \ \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \ \ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\ \ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\ \ \"acc_norm\": 0.8128654970760234,\n 
\"acc_norm_stderr\": 0.02991312723236804\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\ \ \"mc1_stderr\": 0.017408513063422913,\n \"mc2\": 0.7102687357878247,\n\ \ \"mc2_stderr\": 0.014983390722269722\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433537\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6557998483699773,\n \ \ \"acc_stderr\": 0.013086800426693782\n }\n}\n```" repo_url: https://huggingface.co/ResplendentAI/Datura_7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|arc:challenge|25_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-09T18-45-55.161838.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|gsm8k|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hellaswag|10_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-45-55.161838.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-45-55.161838.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-45-55.161838.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-45-55.161838.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-45-55.161838.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|truthfulqa:mc|0_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-09T18-45-55.161838.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_09T18_45_55.161838 path: - '**/details_harness|winogrande|5_2024-03-09T18-45-55.161838.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-09T18-45-55.161838.parquet' - config_name: results data_files: - split: 2024_03_09T18_45_55.161838 path: - results_2024-03-09T18-45-55.161838.parquet - split: latest path: - results_2024-03-09T18-45-55.161838.parquet --- # Dataset Card for Evaluation run of ResplendentAI/Datura_7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ResplendentAI__Datura_7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-09T18:45:55.161838](https://huggingface.co/datasets/open-llm-leaderboard/details_ResplendentAI__Datura_7B/blob/main/results_2024-03-09T18-45-55.161838.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.64722982592556, "acc_stderr": 0.03230063546213469, "acc_norm": 0.6468974706224688, "acc_norm_stderr": 0.03297483961998492, "mc1": 0.5520195838433293, "mc1_stderr": 0.017408513063422913, "mc2": 0.7102687357878247, "mc2_stderr": 0.014983390722269722 }, "harness|arc:challenge|25": { "acc": 0.6996587030716723, "acc_stderr": 0.013395909309957009, "acc_norm": 0.7209897610921502, "acc_norm_stderr": 0.013106784883601327 }, "harness|hellaswag|10": { "acc": 0.716988647679745, "acc_stderr": 0.0044954128683246065, "acc_norm": 0.882692690699064, "acc_norm_stderr": 0.003211284760701654 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.042446332383532265, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.042446332383532265 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { 
"acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009181, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009181 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8181818181818182, "acc_stderr": 0.027479603010538804, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.027479603010538804 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971128, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971128 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886797, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886797 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8311926605504587, "acc_stderr": 0.016060056268530343, "acc_norm": 0.8311926605504587, "acc_norm_stderr": 0.016060056268530343 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 
0.026558372502661916, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.026558372502661916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.035477710041594654, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.035477710041594654 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.038935425188248475, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.038935425188248475 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8148148148148148, "acc_stderr": 0.013890862162876163, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.013890862162876163 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.40893854748603353, "acc_stderr": 0.01644283065471554, "acc_norm": 0.40893854748603353, "acc_norm_stderr": 0.01644283065471554 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818767, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818767 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.02438366553103545, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.02438366553103545 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4654498044328553, "acc_stderr": 0.012739711554045704, "acc_norm": 0.4654498044328553, "acc_norm_stderr": 0.012739711554045704 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6552287581699346, "acc_stderr": 0.019228322018696647, "acc_norm": 0.6552287581699346, "acc_norm_stderr": 0.019228322018696647 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291296, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.5520195838433293, "mc1_stderr": 0.017408513063422913, "mc2": 0.7102687357878247, "mc2_stderr": 0.014983390722269722 }, "harness|winogrande|5": { "acc": 0.8453038674033149, "acc_stderr": 0.010163172650433537 }, "harness|gsm8k|5": { "acc": 0.6557998483699773, "acc_stderr": 0.013086800426693782 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
bill141477/b1ue
--- license: mit ---
karmiq/glove
---
license: pddl
language:
- en
dataset_info:
  description: >-
    Pre-trained word vectors with 50 dimensions for GloVe: Global Vectors for
    Word Representation
  homepage: https://nlp.stanford.edu/projects/glove/
  license: pddl
  features:
  - name: word
    dtype: string
  - name: embeddings
    sequence: float64
---

## Pre-trained vectors from GloVe: Global Vectors for Word Representation

This dataset provides the 50-dimensional pre-trained embeddings from <https://nlp.stanford.edu/projects/glove/>.
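Because the schema above is just a `word` column paired with a 50-dimensional `embeddings` column, the whole table can be turned into an in-memory lookup. The snippet below is a minimal sketch rather than part of the original card; it assumes the dataset id `karmiq/glove`, a single default `train` split, and that common English words such as "king" and "queen" appear in the vocabulary.

```python
import numpy as np
from datasets import load_dataset

# Load the 50-dimensional GloVe vectors (assumes a single default "train" split).
glove = load_dataset("karmiq/glove", split="train")

# Build a word -> vector lookup table in memory.
vectors = {row["word"]: np.array(row["embeddings"]) for row in glove}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# "king" and "queen" are assumed to be present in the vocabulary.
print(cosine(vectors["king"], vectors["queen"]))
```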
danjacobellis/imagenet_dino
--- dataset_info: features: - name: label dtype: class_label: names: '0': tench, Tinca tinca '1': goldfish, Carassius auratus '2': great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias '3': tiger shark, Galeocerdo cuvieri '4': hammerhead, hammerhead shark '5': electric ray, crampfish, numbfish, torpedo '6': stingray '7': cock '8': hen '9': ostrich, Struthio camelus '10': brambling, Fringilla montifringilla '11': goldfinch, Carduelis carduelis '12': house finch, linnet, Carpodacus mexicanus '13': junco, snowbird '14': indigo bunting, indigo finch, indigo bird, Passerina cyanea '15': robin, American robin, Turdus migratorius '16': bulbul '17': jay '18': magpie '19': chickadee '20': water ouzel, dipper '21': kite '22': bald eagle, American eagle, Haliaeetus leucocephalus '23': vulture '24': great grey owl, great gray owl, Strix nebulosa '25': European fire salamander, Salamandra salamandra '26': common newt, Triturus vulgaris '27': eft '28': spotted salamander, Ambystoma maculatum '29': axolotl, mud puppy, Ambystoma mexicanum '30': bullfrog, Rana catesbeiana '31': tree frog, tree-frog '32': tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui '33': loggerhead, loggerhead turtle, Caretta caretta '34': leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea '35': mud turtle '36': terrapin '37': box turtle, box tortoise '38': banded gecko '39': common iguana, iguana, Iguana iguana '40': American chameleon, anole, Anolis carolinensis '41': whiptail, whiptail lizard '42': agama '43': frilled lizard, Chlamydosaurus kingi '44': alligator lizard '45': Gila monster, Heloderma suspectum '46': green lizard, Lacerta viridis '47': African chameleon, Chamaeleo chamaeleon '48': Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis '49': African crocodile, Nile crocodile, Crocodylus niloticus '50': American alligator, Alligator mississipiensis '51': triceratops '52': thunder snake, worm snake, Carphophis amoenus '53': ringneck snake, ring-necked snake, ring snake '54': hognose snake, puff adder, sand viper '55': green snake, grass snake '56': king snake, kingsnake '57': garter snake, grass snake '58': water snake '59': vine snake '60': night snake, Hypsiglena torquata '61': boa constrictor, Constrictor constrictor '62': rock python, rock snake, Python sebae '63': Indian cobra, Naja naja '64': green mamba '65': sea snake '66': horned viper, cerastes, sand viper, horned asp, Cerastes cornutus '67': diamondback, diamondback rattlesnake, Crotalus adamanteus '68': sidewinder, horned rattlesnake, Crotalus cerastes '69': trilobite '70': harvestman, daddy longlegs, Phalangium opilio '71': scorpion '72': black and gold garden spider, Argiope aurantia '73': barn spider, Araneus cavaticus '74': garden spider, Aranea diademata '75': black widow, Latrodectus mactans '76': tarantula '77': wolf spider, hunting spider '78': tick '79': centipede '80': black grouse '81': ptarmigan '82': ruffed grouse, partridge, Bonasa umbellus '83': prairie chicken, prairie grouse, prairie fowl '84': peacock '85': quail '86': partridge '87': African grey, African gray, Psittacus erithacus '88': macaw '89': sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita '90': lorikeet '91': coucal '92': bee eater '93': hornbill '94': hummingbird '95': jacamar '96': toucan '97': drake '98': red-breasted merganser, Mergus serrator '99': goose '100': black swan, Cygnus atratus '101': tusker '102': echidna, spiny anteater, anteater '103': platypus, duckbill, duckbilled 
platypus, duck-billed platypus, Ornithorhynchus anatinus '104': wallaby, brush kangaroo '105': koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus '106': wombat '107': jellyfish '108': sea anemone, anemone '109': brain coral '110': flatworm, platyhelminth '111': nematode, nematode worm, roundworm '112': conch '113': snail '114': slug '115': sea slug, nudibranch '116': chiton, coat-of-mail shell, sea cradle, polyplacophore '117': chambered nautilus, pearly nautilus, nautilus '118': Dungeness crab, Cancer magister '119': rock crab, Cancer irroratus '120': fiddler crab '121': king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica '122': American lobster, Northern lobster, Maine lobster, Homarus americanus '123': spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish '124': crayfish, crawfish, crawdad, crawdaddy '125': hermit crab '126': isopod '127': white stork, Ciconia ciconia '128': black stork, Ciconia nigra '129': spoonbill '130': flamingo '131': little blue heron, Egretta caerulea '132': American egret, great white heron, Egretta albus '133': bittern '134': crane '135': limpkin, Aramus pictus '136': European gallinule, Porphyrio porphyrio '137': American coot, marsh hen, mud hen, water hen, Fulica americana '138': bustard '139': ruddy turnstone, Arenaria interpres '140': red-backed sandpiper, dunlin, Erolia alpina '141': redshank, Tringa totanus '142': dowitcher '143': oystercatcher, oyster catcher '144': pelican '145': king penguin, Aptenodytes patagonica '146': albatross, mollymawk '147': grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus '148': killer whale, killer, orca, grampus, sea wolf, Orcinus orca '149': dugong, Dugong dugon '150': sea lion '151': Chihuahua '152': Japanese spaniel '153': Maltese dog, Maltese terrier, Maltese '154': Pekinese, Pekingese, Peke '155': Shih-Tzu '156': Blenheim spaniel '157': papillon '158': toy terrier '159': Rhodesian ridgeback '160': Afghan hound, Afghan '161': basset, basset hound '162': beagle '163': bloodhound, sleuthhound '164': bluetick '165': black-and-tan coonhound '166': Walker hound, Walker foxhound '167': English foxhound '168': redbone '169': borzoi, Russian wolfhound '170': Irish wolfhound '171': Italian greyhound '172': whippet '173': Ibizan hound, Ibizan Podenco '174': Norwegian elkhound, elkhound '175': otterhound, otter hound '176': Saluki, gazelle hound '177': Scottish deerhound, deerhound '178': Weimaraner '179': Staffordshire bullterrier, Staffordshire bull terrier '180': American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier '181': Bedlington terrier '182': Border terrier '183': Kerry blue terrier '184': Irish terrier '185': Norfolk terrier '186': Norwich terrier '187': Yorkshire terrier '188': wire-haired fox terrier '189': Lakeland terrier '190': Sealyham terrier, Sealyham '191': Airedale, Airedale terrier '192': cairn, cairn terrier '193': Australian terrier '194': Dandie Dinmont, Dandie Dinmont terrier '195': Boston bull, Boston terrier '196': miniature schnauzer '197': giant schnauzer '198': standard schnauzer '199': Scotch terrier, Scottish terrier, Scottie '200': Tibetan terrier, chrysanthemum dog '201': silky terrier, Sydney silky '202': soft-coated wheaten terrier '203': West Highland white terrier '204': Lhasa, Lhasa apso '205': flat-coated retriever '206': curly-coated retriever '207': golden retriever '208': Labrador retriever '209': Chesapeake Bay retriever '210': German 
short-haired pointer '211': vizsla, Hungarian pointer '212': English setter '213': Irish setter, red setter '214': Gordon setter '215': Brittany spaniel '216': clumber, clumber spaniel '217': English springer, English springer spaniel '218': Welsh springer spaniel '219': cocker spaniel, English cocker spaniel, cocker '220': Sussex spaniel '221': Irish water spaniel '222': kuvasz '223': schipperke '224': groenendael '225': malinois '226': briard '227': kelpie '228': komondor '229': Old English sheepdog, bobtail '230': Shetland sheepdog, Shetland sheep dog, Shetland '231': collie '232': Border collie '233': Bouvier des Flandres, Bouviers des Flandres '234': Rottweiler '235': German shepherd, German shepherd dog, German police dog, alsatian '236': Doberman, Doberman pinscher '237': miniature pinscher '238': Greater Swiss Mountain dog '239': Bernese mountain dog '240': Appenzeller '241': EntleBucher '242': boxer '243': bull mastiff '244': Tibetan mastiff '245': French bulldog '246': Great Dane '247': Saint Bernard, St Bernard '248': Eskimo dog, husky '249': malamute, malemute, Alaskan malamute '250': Siberian husky '251': dalmatian, coach dog, carriage dog '252': affenpinscher, monkey pinscher, monkey dog '253': basenji '254': pug, pug-dog '255': Leonberg '256': Newfoundland, Newfoundland dog '257': Great Pyrenees '258': Samoyed, Samoyede '259': Pomeranian '260': chow, chow chow '261': keeshond '262': Brabancon griffon '263': Pembroke, Pembroke Welsh corgi '264': Cardigan, Cardigan Welsh corgi '265': toy poodle '266': miniature poodle '267': standard poodle '268': Mexican hairless '269': timber wolf, grey wolf, gray wolf, Canis lupus '270': white wolf, Arctic wolf, Canis lupus tundrarum '271': red wolf, maned wolf, Canis rufus, Canis niger '272': coyote, prairie wolf, brush wolf, Canis latrans '273': dingo, warrigal, warragal, Canis dingo '274': dhole, Cuon alpinus '275': African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus '276': hyena, hyaena '277': red fox, Vulpes vulpes '278': kit fox, Vulpes macrotis '279': Arctic fox, white fox, Alopex lagopus '280': grey fox, gray fox, Urocyon cinereoargenteus '281': tabby, tabby cat '282': tiger cat '283': Persian cat '284': Siamese cat, Siamese '285': Egyptian cat '286': cougar, puma, catamount, mountain lion, painter, panther, Felis concolor '287': lynx, catamount '288': leopard, Panthera pardus '289': snow leopard, ounce, Panthera uncia '290': jaguar, panther, Panthera onca, Felis onca '291': lion, king of beasts, Panthera leo '292': tiger, Panthera tigris '293': cheetah, chetah, Acinonyx jubatus '294': brown bear, bruin, Ursus arctos '295': American black bear, black bear, Ursus americanus, Euarctos americanus '296': ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus '297': sloth bear, Melursus ursinus, Ursus ursinus '298': mongoose '299': meerkat, mierkat '300': tiger beetle '301': ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle '302': ground beetle, carabid beetle '303': long-horned beetle, longicorn, longicorn beetle '304': leaf beetle, chrysomelid '305': dung beetle '306': rhinoceros beetle '307': weevil '308': fly '309': bee '310': ant, emmet, pismire '311': grasshopper, hopper '312': cricket '313': walking stick, walkingstick, stick insect '314': cockroach, roach '315': mantis, mantid '316': cicada, cicala '317': leafhopper '318': lacewing, lacewing fly '319': dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk '320': damselfly '321': admiral 
'322': ringlet, ringlet butterfly '323': monarch, monarch butterfly, milkweed butterfly, Danaus plexippus '324': cabbage butterfly '325': sulphur butterfly, sulfur butterfly '326': lycaenid, lycaenid butterfly '327': starfish, sea star '328': sea urchin '329': sea cucumber, holothurian '330': wood rabbit, cottontail, cottontail rabbit '331': hare '332': Angora, Angora rabbit '333': hamster '334': porcupine, hedgehog '335': fox squirrel, eastern fox squirrel, Sciurus niger '336': marmot '337': beaver '338': guinea pig, Cavia cobaya '339': sorrel '340': zebra '341': hog, pig, grunter, squealer, Sus scrofa '342': wild boar, boar, Sus scrofa '343': warthog '344': hippopotamus, hippo, river horse, Hippopotamus amphibius '345': ox '346': water buffalo, water ox, Asiatic buffalo, Bubalus bubalis '347': bison '348': ram, tup '349': bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis '350': ibex, Capra ibex '351': hartebeest '352': impala, Aepyceros melampus '353': gazelle '354': Arabian camel, dromedary, Camelus dromedarius '355': llama '356': weasel '357': mink '358': polecat, fitch, foulmart, foumart, Mustela putorius '359': black-footed ferret, ferret, Mustela nigripes '360': otter '361': skunk, polecat, wood pussy '362': badger '363': armadillo '364': three-toed sloth, ai, Bradypus tridactylus '365': orangutan, orang, orangutang, Pongo pygmaeus '366': gorilla, Gorilla gorilla '367': chimpanzee, chimp, Pan troglodytes '368': gibbon, Hylobates lar '369': siamang, Hylobates syndactylus, Symphalangus syndactylus '370': guenon, guenon monkey '371': patas, hussar monkey, Erythrocebus patas '372': baboon '373': macaque '374': langur '375': colobus, colobus monkey '376': proboscis monkey, Nasalis larvatus '377': marmoset '378': capuchin, ringtail, Cebus capucinus '379': howler monkey, howler '380': titi, titi monkey '381': spider monkey, Ateles geoffroyi '382': squirrel monkey, Saimiri sciureus '383': Madagascar cat, ring-tailed lemur, Lemur catta '384': indri, indris, Indri indri, Indri brevicaudatus '385': Indian elephant, Elephas maximus '386': African elephant, Loxodonta africana '387': lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens '388': giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca '389': barracouta, snoek '390': eel '391': coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch '392': rock beauty, Holocanthus tricolor '393': anemone fish '394': sturgeon '395': gar, garfish, garpike, billfish, Lepisosteus osseus '396': lionfish '397': puffer, pufferfish, blowfish, globefish '398': abacus '399': abaya '400': academic gown, academic robe, judge's robe '401': accordion, piano accordion, squeeze box '402': acoustic guitar '403': aircraft carrier, carrier, flattop, attack aircraft carrier '404': airliner '405': airship, dirigible '406': altar '407': ambulance '408': amphibian, amphibious vehicle '409': analog clock '410': apiary, bee house '411': apron '412': ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin '413': assault rifle, assault gun '414': backpack, back pack, knapsack, packsack, rucksack, haversack '415': bakery, bakeshop, bakehouse '416': balance beam, beam '417': balloon '418': ballpoint, ballpoint pen, ballpen, Biro '419': Band Aid '420': banjo '421': bannister, banister, balustrade, balusters, handrail '422': barbell '423': barber chair '424': barbershop '425': barn '426': barometer '427': barrel, cask '428': barrow, garden cart, lawn 
cart, wheelbarrow '429': baseball '430': basketball '431': bassinet '432': bassoon '433': bathing cap, swimming cap '434': bath towel '435': bathtub, bathing tub, bath, tub '436': beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon '437': beacon, lighthouse, beacon light, pharos '438': beaker '439': bearskin, busby, shako '440': beer bottle '441': beer glass '442': bell cote, bell cot '443': bib '444': bicycle-built-for-two, tandem bicycle, tandem '445': bikini, two-piece '446': binder, ring-binder '447': binoculars, field glasses, opera glasses '448': birdhouse '449': boathouse '450': bobsled, bobsleigh, bob '451': bolo tie, bolo, bola tie, bola '452': bonnet, poke bonnet '453': bookcase '454': bookshop, bookstore, bookstall '455': bottlecap '456': bow '457': bow tie, bow-tie, bowtie '458': brass, memorial tablet, plaque '459': brassiere, bra, bandeau '460': breakwater, groin, groyne, mole, bulwark, seawall, jetty '461': breastplate, aegis, egis '462': broom '463': bucket, pail '464': buckle '465': bulletproof vest '466': bullet train, bullet '467': butcher shop, meat market '468': cab, hack, taxi, taxicab '469': caldron, cauldron '470': candle, taper, wax light '471': cannon '472': canoe '473': can opener, tin opener '474': cardigan '475': car mirror '476': carousel, carrousel, merry-go-round, roundabout, whirligig '477': carpenter's kit, tool kit '478': carton '479': car wheel '480': cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM '481': cassette '482': cassette player '483': castle '484': catamaran '485': CD player '486': cello, violoncello '487': cellular telephone, cellular phone, cellphone, cell, mobile phone '488': chain '489': chainlink fence '490': chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour '491': chain saw, chainsaw '492': chest '493': chiffonier, commode '494': chime, bell, gong '495': china cabinet, china closet '496': Christmas stocking '497': church, church building '498': cinema, movie theater, movie theatre, movie house, picture palace '499': cleaver, meat cleaver, chopper '500': cliff dwelling '501': cloak '502': clog, geta, patten, sabot '503': cocktail shaker '504': coffee mug '505': coffeepot '506': coil, spiral, volute, whorl, helix '507': combination lock '508': computer keyboard, keypad '509': confectionery, confectionary, candy store '510': container ship, containership, container vessel '511': convertible '512': corkscrew, bottle screw '513': cornet, horn, trumpet, trump '514': cowboy boot '515': cowboy hat, ten-gallon hat '516': cradle '517': crane2 '518': crash helmet '519': crate '520': crib, cot '521': Crock Pot '522': croquet ball '523': crutch '524': cuirass '525': dam, dike, dyke '526': desk '527': desktop computer '528': dial telephone, dial phone '529': diaper, nappy, napkin '530': digital clock '531': digital watch '532': dining table, board '533': dishrag, dishcloth '534': dishwasher, dish washer, dishwashing machine '535': disk brake, disc brake '536': dock, dockage, docking facility '537': dogsled, dog sled, dog sleigh '538': dome '539': doormat, welcome mat '540': drilling platform, offshore rig '541': drum, membranophone, tympan '542': drumstick '543': dumbbell '544': Dutch oven '545': electric fan, blower '546': electric guitar '547': electric locomotive '548': entertainment center '549': envelope '550': espresso maker '551': face powder '552': feather boa, boa '553': file, file cabinet, filing cabinet 
'554': fireboat '555': fire engine, fire truck '556': fire screen, fireguard '557': flagpole, flagstaff '558': flute, transverse flute '559': folding chair '560': football helmet '561': forklift '562': fountain '563': fountain pen '564': four-poster '565': freight car '566': French horn, horn '567': frying pan, frypan, skillet '568': fur coat '569': garbage truck, dustcart '570': gasmask, respirator, gas helmet '571': gas pump, gasoline pump, petrol pump, island dispenser '572': goblet '573': go-kart '574': golf ball '575': golfcart, golf cart '576': gondola '577': gong, tam-tam '578': gown '579': grand piano, grand '580': greenhouse, nursery, glasshouse '581': grille, radiator grille '582': grocery store, grocery, food market, market '583': guillotine '584': hair slide '585': hair spray '586': half track '587': hammer '588': hamper '589': hand blower, blow dryer, blow drier, hair dryer, hair drier '590': hand-held computer, hand-held microcomputer '591': handkerchief, hankie, hanky, hankey '592': hard disc, hard disk, fixed disk '593': harmonica, mouth organ, harp, mouth harp '594': harp '595': harvester, reaper '596': hatchet '597': holster '598': home theater, home theatre '599': honeycomb '600': hook, claw '601': hoopskirt, crinoline '602': horizontal bar, high bar '603': horse cart, horse-cart '604': hourglass '605': iPod '606': iron, smoothing iron '607': jack-o'-lantern '608': jean, blue jean, denim '609': jeep, landrover '610': jersey, T-shirt, tee shirt '611': jigsaw puzzle '612': jinrikisha, ricksha, rickshaw '613': joystick '614': kimono '615': knee pad '616': knot '617': lab coat, laboratory coat '618': ladle '619': lampshade, lamp shade '620': laptop, laptop computer '621': lawn mower, mower '622': lens cap, lens cover '623': letter opener, paper knife, paperknife '624': library '625': lifeboat '626': lighter, light, igniter, ignitor '627': limousine, limo '628': liner, ocean liner '629': lipstick, lip rouge '630': Loafer '631': lotion '632': loudspeaker, speaker, speaker unit, loudspeaker system, speaker system '633': loupe, jeweler's loupe '634': lumbermill, sawmill '635': magnetic compass '636': mailbag, postbag '637': mailbox, letter box '638': maillot '639': maillot, tank suit '640': manhole cover '641': maraca '642': marimba, xylophone '643': mask '644': matchstick '645': maypole '646': maze, labyrinth '647': measuring cup '648': medicine chest, medicine cabinet '649': megalith, megalithic structure '650': microphone, mike '651': microwave, microwave oven '652': military uniform '653': milk can '654': minibus '655': miniskirt, mini '656': minivan '657': missile '658': mitten '659': mixing bowl '660': mobile home, manufactured home '661': Model T '662': modem '663': monastery '664': monitor '665': moped '666': mortar '667': mortarboard '668': mosque '669': mosquito net '670': motor scooter, scooter '671': mountain bike, all-terrain bike, off-roader '672': mountain tent '673': mouse, computer mouse '674': mousetrap '675': moving van '676': muzzle '677': nail '678': neck brace '679': necklace '680': nipple '681': notebook, notebook computer '682': obelisk '683': oboe, hautboy, hautbois '684': ocarina, sweet potato '685': odometer, hodometer, mileometer, milometer '686': oil filter '687': organ, pipe organ '688': oscilloscope, scope, cathode-ray oscilloscope, CRO '689': overskirt '690': oxcart '691': oxygen mask '692': packet '693': paddle, boat paddle '694': paddlewheel, paddle wheel '695': padlock '696': paintbrush '697': pajama, pyjama, pj's, jammies '698': palace '699': 
panpipe, pandean pipe, syrinx '700': paper towel '701': parachute, chute '702': parallel bars, bars '703': park bench '704': parking meter '705': passenger car, coach, carriage '706': patio, terrace '707': pay-phone, pay-station '708': pedestal, plinth, footstall '709': pencil box, pencil case '710': pencil sharpener '711': perfume, essence '712': Petri dish '713': photocopier '714': pick, plectrum, plectron '715': pickelhaube '716': picket fence, paling '717': pickup, pickup truck '718': pier '719': piggy bank, penny bank '720': pill bottle '721': pillow '722': ping-pong ball '723': pinwheel '724': pirate, pirate ship '725': pitcher, ewer '726': plane, carpenter's plane, woodworking plane '727': planetarium '728': plastic bag '729': plate rack '730': plow, plough '731': plunger, plumber's helper '732': Polaroid camera, Polaroid Land camera '733': pole '734': police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria '735': poncho '736': pool table, billiard table, snooker table '737': pop bottle, soda bottle '738': pot, flowerpot '739': potter's wheel '740': power drill '741': prayer rug, prayer mat '742': printer '743': prison, prison house '744': projectile, missile '745': projector '746': puck, hockey puck '747': punching bag, punch bag, punching ball, punchball '748': purse '749': quill, quill pen '750': quilt, comforter, comfort, puff '751': racer, race car, racing car '752': racket, racquet '753': radiator '754': radio, wireless '755': radio telescope, radio reflector '756': rain barrel '757': recreational vehicle, RV, R.V. '758': reel '759': reflex camera '760': refrigerator, icebox '761': remote control, remote '762': restaurant, eating house, eating place, eatery '763': revolver, six-gun, six-shooter '764': rifle '765': rocking chair, rocker '766': rotisserie '767': rubber eraser, rubber, pencil eraser '768': rugby ball '769': rule, ruler '770': running shoe '771': safe '772': safety pin '773': saltshaker, salt shaker '774': sandal '775': sarong '776': sax, saxophone '777': scabbard '778': scale, weighing machine '779': school bus '780': schooner '781': scoreboard '782': screen, CRT screen '783': screw '784': screwdriver '785': seat belt, seatbelt '786': sewing machine '787': shield, buckler '788': shoe shop, shoe-shop, shoe store '789': shoji '790': shopping basket '791': shopping cart '792': shovel '793': shower cap '794': shower curtain '795': ski '796': ski mask '797': sleeping bag '798': slide rule, slipstick '799': sliding door '800': slot, one-armed bandit '801': snorkel '802': snowmobile '803': snowplow, snowplough '804': soap dispenser '805': soccer ball '806': sock '807': solar dish, solar collector, solar furnace '808': sombrero '809': soup bowl '810': space bar '811': space heater '812': space shuttle '813': spatula '814': speedboat '815': spider web, spider's web '816': spindle '817': sports car, sport car '818': spotlight, spot '819': stage '820': steam locomotive '821': steel arch bridge '822': steel drum '823': stethoscope '824': stole '825': stone wall '826': stopwatch, stop watch '827': stove '828': strainer '829': streetcar, tram, tramcar, trolley, trolley car '830': stretcher '831': studio couch, day bed '832': stupa, tope '833': submarine, pigboat, sub, U-boat '834': suit, suit of clothes '835': sundial '836': sunglass '837': sunglasses, dark glasses, shades '838': sunscreen, sunblock, sun blocker '839': suspension bridge '840': swab, swob, mop '841': sweatshirt '842': swimming trunks, bathing trunks '843': swing '844': switch, electric switch, 
electrical switch '845': syringe '846': table lamp '847': tank, army tank, armored combat vehicle, armoured combat vehicle '848': tape player '849': teapot '850': teddy, teddy bear '851': television, television system '852': tennis ball '853': thatch, thatched roof '854': theater curtain, theatre curtain '855': thimble '856': thresher, thrasher, threshing machine '857': throne '858': tile roof '859': toaster '860': tobacco shop, tobacconist shop, tobacconist '861': toilet seat '862': torch '863': totem pole '864': tow truck, tow car, wrecker '865': toyshop '866': tractor '867': trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi '868': tray '869': trench coat '870': tricycle, trike, velocipede '871': trimaran '872': tripod '873': triumphal arch '874': trolleybus, trolley coach, trackless trolley '875': trombone '876': tub, vat '877': turnstile '878': typewriter keyboard '879': umbrella '880': unicycle, monocycle '881': upright, upright piano '882': vacuum, vacuum cleaner '883': vase '884': vault '885': velvet '886': vending machine '887': vestment '888': viaduct '889': violin, fiddle '890': volleyball '891': waffle iron '892': wall clock '893': wallet, billfold, notecase, pocketbook '894': wardrobe, closet, press '895': warplane, military plane '896': washbasin, handbasin, washbowl, lavabo, wash-hand basin '897': washer, automatic washer, washing machine '898': water bottle '899': water jug '900': water tower '901': whiskey jug '902': whistle '903': wig '904': window screen '905': window shade '906': Windsor tie '907': wine bottle '908': wing '909': wok '910': wooden spoon '911': wool, woolen, woollen '912': worm fence, snake fence, snake-rail fence, Virginia fence '913': wreck '914': yawl '915': yurt '916': web site, website, internet site, site '917': comic book '918': crossword puzzle, crossword '919': street sign '920': traffic light, traffic signal, stoplight '921': book jacket, dust cover, dust jacket, dust wrapper '922': menu '923': plate '924': guacamole '925': consomme '926': hot pot, hotpot '927': trifle '928': ice cream, icecream '929': ice lolly, lolly, lollipop, popsicle '930': French loaf '931': bagel, beigel '932': pretzel '933': cheeseburger '934': hotdog, hot dog, red hot '935': mashed potato '936': head cabbage '937': broccoli '938': cauliflower '939': zucchini, courgette '940': spaghetti squash '941': acorn squash '942': butternut squash '943': cucumber, cuke '944': artichoke, globe artichoke '945': bell pepper '946': cardoon '947': mushroom '948': Granny Smith '949': strawberry '950': orange '951': lemon '952': fig '953': pineapple, ananas '954': banana '955': jackfruit, jak, jack '956': custard apple '957': pomegranate '958': hay '959': carbonara '960': chocolate sauce, chocolate syrup '961': dough '962': meat loaf, meatloaf '963': pizza, pizza pie '964': potpie '965': burrito '966': red wine '967': espresso '968': cup '969': eggnog '970': alp '971': bubble '972': cliff, drop, drop-off '973': coral reef '974': geyser '975': lakeside, lakeshore '976': promontory, headland, head, foreland '977': sandbar, sand bar '978': seashore, coast, seacoast, sea-coast '979': valley, vale '980': volcano '981': ballplayer, baseball player '982': groom, bridegroom '983': scuba diver '984': rapeseed '985': daisy '986': yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum '987': corn '988': acorn '989': hip, rose hip, rosehip '990': buckeye, horse chestnut, conker '991': coral fungus '992': agaric '993': gyromitra '994': 
stinkhorn, carrion fungus '995': earthstar '996': hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa '997': bolete '998': ear, spike, capitulum '999': toilet tissue, toilet paper, bathroom tissue - name: cls_token sequence: sequence: float32 - name: patch_tokens sequence: sequence: sequence: float32 splits: - name: train num_bytes: 20224716800 num_examples: 12800 - name: validation num_bytes: 7900280000 num_examples: 5000 download_size: 23726734894 dataset_size: 28124996800 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* --- # Dataset Card for "imagenet_dino" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
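The schema above indicates that each record pairs an ImageNet-1k class label (the name mapping listed above) with precomputed ViT embeddings rather than raw images — presumably DINO features, given the dataset name: a per-image `cls_token` and a set of `patch_tokens`, for 12,800 training and 5,000 validation examples. A minimal loading sketch with the `datasets` library follows; the repository path is a placeholder (the card does not state the full Hub ID), and the printed shapes depend on the stored feature dimensions.

```python
from datasets import load_dataset
import numpy as np

# Placeholder repo path -- substitute the actual Hub ID that hosts "imagenet_dino".
ds = load_dataset("<namespace>/imagenet_dino", split="validation")

example = ds[0]
cls_token = np.asarray(example["cls_token"], dtype=np.float32)        # [CLS] token embedding(s)
patch_tokens = np.asarray(example["patch_tokens"], dtype=np.float32)  # patch token embeddings

print("cls_token:", cls_token.shape, "patch_tokens:", patch_tokens.shape)

# The full download is ~23.7 GB, so streaming can be preferable:
# ds = load_dataset("<namespace>/imagenet_dino", split="train", streaming=True)
```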
open-llm-leaderboard/details_hydra-project__ChatHercules-2.5-Mistral-7B
--- pretty_name: Evaluation run of hydra-project/ChatHercules-2.5-Mistral-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hydra-project/ChatHercules-2.5-Mistral-7B](https://huggingface.co/hydra-project/ChatHercules-2.5-Mistral-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hydra-project__ChatHercules-2.5-Mistral-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-04T07:05:35.554823](https://huggingface.co/datasets/open-llm-leaderboard/details_hydra-project__ChatHercules-2.5-Mistral-7B/blob/main/results_2024-03-04T07-05-35.554823.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6553363670600032,\n\ \ \"acc_stderr\": 0.03173032509061873,\n \"acc_norm\": 0.6567393158065117,\n\ \ \"acc_norm_stderr\": 0.032374391455577266,\n \"mc1\": 0.31334149326805383,\n\ \ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4752375097401301,\n\ \ \"mc2_stderr\": 0.01471475431086459\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414044,\n\ \ \"acc_norm\": 0.6510238907849829,\n \"acc_norm_stderr\": 0.013928933461382501\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n\ \ \"acc_stderr\": 0.0047750796365670966,\n \"acc_norm\": 0.8461461860187214,\n\ \ \"acc_norm_stderr\": 0.00360071170449341\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\ \ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741713,\n\ \ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741713\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 
0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\ \ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\ \ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\ \ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\ \ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\ acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\ \ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\ \ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8,\n \"acc_stderr\": 0.022755204959542943,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.022755204959542943\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\ \ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\ \ },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.01932180555722315,\n\ \ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.01932180555722315\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \ \ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630882,\n \ \ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630882\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\ acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n\ \ \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n\ \ \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601457,\n\ \ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601457\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\ \ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\ \ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\ acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n\ \ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\ \ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\ \ \"acc_stderr\": 
0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n\ \ \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \ \ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\ \ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\ \ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\ \ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\ \ \"acc_stderr\": 0.014593620923210746,\n \"acc_norm\": 0.2558659217877095,\n\ \ \"acc_norm_stderr\": 0.014593620923210746\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\ \ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\ \ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\ \ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n\ \ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\ \ \"acc_stderr\": 0.012761104871472653,\n \"acc_norm\": 0.4810951760104302,\n\ \ \"acc_norm_stderr\": 0.012761104871472653\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625173,\n\ \ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625173\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \ \ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\ \ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\ \ 
\"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\ \ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4752375097401301,\n\ \ \"mc2_stderr\": 0.01471475431086459\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007493\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6497346474601972,\n \ \ \"acc_stderr\": 0.013140409455571291\n }\n}\n```" repo_url: https://huggingface.co/hydra-project/ChatHercules-2.5-Mistral-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|arc:challenge|25_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-04T07-05-35.554823.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|gsm8k|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hellaswag|10_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-05-35.554823.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-05-35.554823.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-05-35.554823.parquet' - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-05-35.554823.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_04T07_05_35.554823 path: 
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-management|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T07-05-35.554823.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|truthfulqa:mc|0_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-04T07-05-35.554823.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_04T07_05_35.554823 path: - '**/details_harness|winogrande|5_2024-03-04T07-05-35.554823.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-04T07-05-35.554823.parquet' - config_name: results data_files: - split: 2024_03_04T07_05_35.554823 path: - results_2024-03-04T07-05-35.554823.parquet - split: latest path: - results_2024-03-04T07-05-35.554823.parquet --- # Dataset Card for Evaluation run of hydra-project/ChatHercules-2.5-Mistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [hydra-project/ChatHercules-2.5-Mistral-7B](https://huggingface.co/hydra-project/ChatHercules-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
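As a minimal sketch (assuming network access to the Hugging Face Hub), the per-task configurations and their timestamped splits described above can be listed with helper functions from the `datasets` library:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo_id = "open-llm-leaderboard/details_hydra-project__ChatHercules-2.5-Mistral-7B"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo_id)
print(len(configs), "configurations, e.g.:", configs[:3])

# Each configuration exposes a split named after the run timestamp and a "latest" split.
print(get_dataset_split_names(repo_id, "harness_winogrande_5"))
```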
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_hydra-project__ChatHercules-2.5-Mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-04T07:05:35.554823](https://huggingface.co/datasets/open-llm-leaderboard/details_hydra-project__ChatHercules-2.5-Mistral-7B/blob/main/results_2024-03-04T07-05-35.554823.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6553363670600032, "acc_stderr": 0.03173032509061873, "acc_norm": 0.6567393158065117, "acc_norm_stderr": 0.032374391455577266, "mc1": 0.31334149326805383, "mc1_stderr": 0.016238065069059605, "mc2": 0.4752375097401301, "mc2_stderr": 0.01471475431086459 }, "harness|arc:challenge|25": { "acc": 0.6117747440273038, "acc_stderr": 0.014241614207414044, "acc_norm": 0.6510238907849829, "acc_norm_stderr": 0.013928933461382501 }, "harness|hellaswag|10": { "acc": 0.6450906193985262, "acc_stderr": 0.0047750796365670966, "acc_norm": 0.8461461860187214, "acc_norm_stderr": 0.00360071170449341 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7236842105263158, "acc_stderr": 0.03639057569952929, "acc_norm": 0.7236842105263158, "acc_norm_stderr": 0.03639057569952929 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7358490566037735, "acc_stderr": 0.027134291628741713, "acc_norm": 0.7358490566037735, "acc_norm_stderr": 0.027134291628741713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6994219653179191, "acc_stderr": 0.0349610148119118, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.0349610148119118 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 },
"harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6127659574468085, "acc_stderr": 0.03184389265339526, "acc_norm": 0.6127659574468085, "acc_norm_stderr": 0.03184389265339526 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406783, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.022755204959542943, "acc_norm": 0.8, "acc_norm_stderr": 0.022755204959542943 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5369458128078818, "acc_stderr": 0.035083705204426656, "acc_norm": 0.5369458128078818, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790482, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790482 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9222797927461139, "acc_stderr": 0.01932180555722315, "acc_norm": 0.9222797927461139, "acc_norm_stderr": 0.01932180555722315 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3851851851851852, "acc_stderr": 0.029670906124630882, "acc_norm": 0.3851851851851852, "acc_norm_stderr": 0.029670906124630882 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634335, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634335 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.01570349834846177, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.01570349834846177 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { 
"acc": 0.7974683544303798, "acc_stderr": 0.026160568246601457, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601457 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822915, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822915 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.03457272836917671, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.03457272836917671 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.031570650789119, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.031570650789119 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092365, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.02335736578587403, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.02335736578587403 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2558659217877095, "acc_stderr": 0.014593620923210746, "acc_norm": 0.2558659217877095, "acc_norm_stderr": 0.014593620923210746 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.761437908496732, "acc_stderr": 0.02440439492808787, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.02440439492808787 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539966, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7592592592592593, "acc_stderr": 0.02378858355165854, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.02378858355165854 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5070921985815603, "acc_stderr": 0.02982449855912901, "acc_norm": 0.5070921985815603, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4810951760104302, "acc_stderr": 0.012761104871472653, "acc_norm": 0.4810951760104302, "acc_norm_stderr": 0.012761104871472653 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7242647058823529, "acc_stderr": 0.027146271936625173, "acc_norm": 0.7242647058823529, "acc_norm_stderr": 0.027146271936625173 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069446, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069446 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 }, "harness|truthfulqa:mc|0": { "mc1": 0.31334149326805383, "mc1_stderr": 0.016238065069059605, "mc2": 0.4752375097401301, "mc2_stderr": 0.01471475431086459 }, "harness|winogrande|5": { "acc": 0.8184688239936859, "acc_stderr": 0.010833276515007493 }, "harness|gsm8k|5": { "acc": 0.6497346474601972, "acc_stderr": 0.013140409455571291 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
ura-hcmut/orca_dpo_pairs
---
license: cc-by-sa-4.0
language:
- vi
- en
size_categories:
- 10K<n<100K
configs:
- config_name: original
  data_files:
  - split: train
    path: orca_dpo_pairs_train.csv
- config_name: clean
  data_files:
  - split: train
    path: orca_dpo_pairs_train_filtered.csv
- config_name: dropped
  data_files:
  - split: train
    path: orca_dpo_pairs_train_dropped.csv
---
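To make the config layout above concrete, here is a minimal sketch of loading one of the three variants with the `datasets` library; it assumes the CSV files referenced in the front matter are present in the repository, and the config names are taken directly from the YAML above.

```python
from datasets import load_dataset

# "original", "clean" and "dropped" are the config names declared in the
# YAML above; each exposes a single "train" split backed by a CSV file.
clean_train = load_dataset("ura-hcmut/orca_dpo_pairs", "clean", split="train")
print(clean_train)
```

Swapping `"clean"` for `"original"` or `"dropped"` selects the unfiltered or the filtered-out pairs, respectively.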
Richardson23/23
--- license: apache-2.0 ---
huggingartists/ddt
---
language:
- en
tags:
- huggingartists
- lyrics
---

# Dataset Card for "huggingartists/ddt"

## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)

## Dataset Description

- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.057123 MB

<div class="inline-flex flex-col" style="line-height: 1.5;">
    <div class="flex">
        <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://images.rapgenius.com/avatars/medium/f258b58a22ea31bb81b73395c47e5ba4&#39;)">
        </div>
    </div>
    <a href="https://huggingface.co/huggingartists/ddt">
        <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
    </a>
    <div style="text-align: center; font-size: 16px; font-weight: 800">DDT</div>
    <a href="https://genius.com/artists/ddt">
        <div style="text-align: center; font-size: 14px;">@ddt</div>
    </a>
</div>

### Dataset Summary

The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/ddt).

### Supported Tasks and Leaderboards

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Languages

en

## How to use

How to load this dataset directly with the datasets library:

```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/ddt")
```

## Dataset Structure

An example of 'train' looks as follows.

```
This example was too long and was cropped:

{
    "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```

### Data Fields

The data fields are the same among all splits.

- `text`: a `string` feature.
### Data Splits

| train | validation | test |
|------:|-----------:|-----:|
|    20 |          - |    - |

'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:

```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/ddt")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))]
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```

## Dataset Creation

### Curation Rationale

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the annotators?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```

## About

*Built by Aleksey Korshuk*

[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)

[![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk)

[![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky)

For more details, visit the project repository.
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
CyberHarem/san_francisco_azurlane
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of san_francisco/サンフランシスコ/旧金山 (Azur Lane)

This is the dataset of san_francisco/サンフランシスコ/旧金山 (Azur Lane), containing 43 images and their tags.

The core tags of this character are `breasts, long_hair, multicolored_hair, purple_eyes, twintails, grey_hair, hair_horns, streaked_hair, ribbon, hair_ribbon, large_breasts, bangs, hair_bun, black_ribbon, very_long_hair, hair_between_eyes, medium_breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                  | Type       | Description                                                           |
|:-----------------|-------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 43     | 84.51 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/san_francisco_azurlane/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 43     | 38.87 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/san_francisco_azurlane/resolve/main/dataset-800.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 111    | 87.73 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/san_francisco_azurlane/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 43     | 68.18 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/san_francisco_azurlane/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 111    | 145.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/san_francisco_azurlane/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/san_francisco_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 31 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, cleavage, solo, smile, bare_shoulders, open_jacket, black_gloves, blue_jacket, fingerless_gloves, black_nails, nail_polish, open_mouth, simple_background, baseball_bat, blush, white_background, long_sleeves, two-tone_hair, covered_navel, thighhighs, holding, off_shoulder, double_bun, standing | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cleavage, grin, looking_at_viewer, midriff, navel, rabbit_ears, solo, arms_up, crop_top, fake_animal_ears, suspenders, white_skirt, bowtie, braid, detached_collar, fishnets, miniskirt, nail_polish, stomach, armpits, bare_shoulders, black_hair, pantyhose, pink_eyes, sitting | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | cleavage | solo | smile | bare_shoulders | open_jacket | black_gloves | blue_jacket | fingerless_gloves | black_nails | nail_polish | open_mouth | simple_background | baseball_bat | blush | white_background | long_sleeves | two-tone_hair | covered_navel | thighhighs | holding | off_shoulder | double_bun | standing | grin | midriff | navel | rabbit_ears | arms_up | crop_top | fake_animal_ears | suspenders | white_skirt | bowtie | braid | detached_collar | fishnets | miniskirt | stomach | armpits | black_hair | pantyhose | pink_eyes | sitting | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------|:-------|:--------|:-----------------|:--------------|:---------------|:--------------|:--------------------|:--------------|:--------------|:-------------|:--------------------|:---------------|:--------|:-------------------|:---------------|:----------------|:----------------|:-------------|:----------|:---------------|:-------------|:-----------|:-------|:----------|:--------|:--------------|:----------|:-----------|:-------------------|:-------------|:--------------|:---------|:--------|:------------------|:-----------|:------------|:----------|:----------|:-------------|:------------|:------------|:----------| | 0 | 31 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | 
![](samples/1/clu1-sample4.png) | X | X | X | X | | X | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
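Besides the raw waifuc package shown above, the `IMG+TXT` packages in the List of Packages table are plain zip archives. The sketch below downloads the 800px variant; the assumption that each image sits next to a same-named `.txt` tag file follows from the `IMG+TXT` description and is not spelled out by this card.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/san_francisco_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Assumed layout: image files paired with same-named .txt files holding tags.
for name in sorted(os.listdir(dataset_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
        break
```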
GAIR/ReAlign-Open-Platypus
--- task_categories: - question-answering - conversational language: - en size_categories: - 10K<n<100K --- Please refer to our [GitHub repo](https://github.com/GAIR-NLP/ReAlign) for more details.
EddSB/Chains_v1.1
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype: image
  splits:
  - name: train
    num_bytes: 64016244.0
    num_examples: 66
  download_size: 6047586
  dataset_size: 64016244.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
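As a rough illustration of the schema declared above (two image-typed columns, `image` and `label`, in a single 66-example `train` split), the snippet below loads the dataset and inspects one pair; it is only a sketch and assumes the default config resolves as described in the YAML.

```python
from datasets import load_dataset

# Load the single "train" split of the default config declared above.
ds = load_dataset("EddSB/Chains_v1.1", split="train")

example = ds[0]
# Both columns are decoded to PIL images by the datasets library.
print(example["image"].size, example["label"].size)
```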
LinhDuong/chatdoctor-5k
---
license: apache-2.0
---

This ChatDoctor-5K dataset is collected from the paper: https://arxiv.org/pdf/2303.14070.pdf

Alternatively, you can download the original dataset from this link: https://drive.google.com/file/d/1nDTKZ3wZbZWTkFMBkxlamrzbNz0frugg/view?usp=sharing
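If you prefer to pull the data straight from this Hub repository rather than the Google Drive link, a minimal sketch is shown below; it assumes the files in this repo are in a format `load_dataset` can read, and it prints the column names rather than assuming a schema, since the card does not document one.

```python
from datasets import load_dataset

# Load whatever splits the repository exposes and inspect the schema.
ds = load_dataset("LinhDuong/chatdoctor-5k")
print(ds)

first_split = next(iter(ds.values()))
print(first_split.column_names)
print(first_split[0])
```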
open-llm-leaderboard/details_Cartinoe5930__iDUS
--- pretty_name: Evaluation run of Cartinoe5930/iDUS dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Cartinoe5930/iDUS](https://huggingface.co/Cartinoe5930/iDUS) on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__iDUS\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-17T13:14:26.897278](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__iDUS/blob/main/results_2024-01-17T13-14-26.897278.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24846552462908655,\n\ \ \"acc_stderr\": 0.030596938174151714,\n \"acc_norm\": 0.24984616637570042,\n\ \ \"acc_norm_stderr\": 0.031417483125595794,\n \"mc1\": 0.22276621787025705,\n\ \ \"mc1_stderr\": 0.014566506961396754,\n \"mc2\": 0.48577541497626797,\n\ \ \"mc2_stderr\": 0.016589496055636796\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.20733788395904437,\n \"acc_stderr\": 0.011846905782971352,\n\ \ \"acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26020713005377416,\n\ \ \"acc_stderr\": 0.004378508362084367,\n \"acc_norm\": 0.2664807807209719,\n\ \ \"acc_norm_stderr\": 0.004412149415717922\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \ \ \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"\ acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n\ \ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n\ \ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\ \ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\ \ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\ \ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.02818544130123409,\n\ \ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.02818544130123409\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\ \ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\ \ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.1793103448275862,\n \"acc_stderr\": 0.031967664333731854,\n\ \ \"acc_norm\": 0.1793103448275862,\n \"acc_norm_stderr\": 0.031967664333731854\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\ acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\ \ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\ \ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\ \ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18064516129032257,\n\ \ \"acc_stderr\": 0.021886178567172548,\n \"acc_norm\": 0.18064516129032257,\n\ \ \"acc_norm_stderr\": 0.021886178567172548\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n\ \ \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\ : 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\ \ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\ acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 
0.028697873971860664,\n\ \ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\ \ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \ \ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n\ \ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.20917431192660552,\n \"acc_stderr\": 0.01743793717334323,\n \"\ acc_norm\": 0.20917431192660552,\n \"acc_norm_stderr\": 0.01743793717334323\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\ acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\ acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \ \ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\ \ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\ \ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\ \ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\ \ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\ \ \"acc_norm_stderr\": 
0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n\ \ \"acc_stderr\": 0.016160871405127522,\n \"acc_norm\": 0.28607918263090676,\n\ \ \"acc_norm_stderr\": 0.016160871405127522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\ \ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\ \ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02392915551735128,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02392915551735128\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\ \ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\ \ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\ \ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \ \ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\ \ \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n\ \ \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244034,\n\ \ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244034\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142765,\n \ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142765\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\ \ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\ \ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\ \ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\ \ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\ \ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\ \ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\ \ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\ \ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 
0.0312678171466318\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\ \ \"mc1_stderr\": 0.014566506961396754,\n \"mc2\": 0.48577541497626797,\n\ \ \"mc2_stderr\": 0.016589496055636796\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.49171270718232046,\n \"acc_stderr\": 0.014050555322824194\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/Cartinoe5930/iDUS leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|arc:challenge|25_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-17T13-14-26.897278.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|gsm8k|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hellaswag|10_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|truthfulqa:mc|0_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-17T13-14-26.897278.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_17T13_14_26.897278 path: - '**/details_harness|winogrande|5_2024-01-17T13-14-26.897278.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-17T13-14-26.897278.parquet' - config_name: results data_files: - split: 2024_01_17T13_14_26.897278 path: - results_2024-01-17T13-14-26.897278.parquet - split: latest path: - results_2024-01-17T13-14-26.897278.parquet --- # Dataset Card for Evaluation run of Cartinoe5930/iDUS <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Cartinoe5930/iDUS](https://huggingface.co/Cartinoe5930/iDUS) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__iDUS", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-17T13:14:26.897278](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__iDUS/blob/main/results_2024-01-17T13-14-26.897278.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24846552462908655, "acc_stderr": 0.030596938174151714, "acc_norm": 0.24984616637570042, "acc_norm_stderr": 0.031417483125595794, "mc1": 0.22276621787025705, "mc1_stderr": 0.014566506961396754, "mc2": 0.48577541497626797, "mc2_stderr": 0.016589496055636796 }, "harness|arc:challenge|25": { "acc": 0.20733788395904437, "acc_stderr": 0.011846905782971352, "acc_norm": 0.2773037542662116, "acc_norm_stderr": 0.013082095839059374 }, "harness|hellaswag|10": { "acc": 0.26020713005377416, "acc_stderr": 0.004378508362084367, "acc_norm": 0.2664807807209719, "acc_norm_stderr": 0.004412149415717922 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2, "acc_stderr": 0.034554737023254366, "acc_norm": 0.2, "acc_norm_stderr": 0.034554737023254366 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.32894736842105265, "acc_stderr": 0.03823428969926604, "acc_norm": 0.32894736842105265, "acc_norm_stderr": 0.03823428969926604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.025288394502891366, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.025288394502891366 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383889, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383889 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 
0.24680851063829787, "acc_stderr": 0.02818544130123409, "acc_norm": 0.24680851063829787, "acc_norm_stderr": 0.02818544130123409 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21052631578947367, "acc_stderr": 0.038351539543994194, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.038351539543994194 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.1793103448275862, "acc_stderr": 0.031967664333731854, "acc_norm": 0.1793103448275862, "acc_norm_stderr": 0.031967664333731854 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2671957671957672, "acc_stderr": 0.022789673145776564, "acc_norm": 0.2671957671957672, "acc_norm_stderr": 0.022789673145776564 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.18064516129032257, "acc_stderr": 0.021886178567172548, "acc_norm": 0.18064516129032257, "acc_norm_stderr": 0.021886178567172548 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23030303030303031, "acc_stderr": 0.03287666758603489, "acc_norm": 0.23030303030303031, "acc_norm_stderr": 0.03287666758603489 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.21851851851851853, "acc_stderr": 0.02519575225182379, "acc_norm": 0.21851851851851853, "acc_norm_stderr": 0.02519575225182379 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21428571428571427, "acc_stderr": 0.02665353159671549, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.02665353159671549 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.20917431192660552, "acc_stderr": 0.01743793717334323, "acc_norm": 0.20917431192660552, "acc_norm_stderr": 0.01743793717334323 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604246, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { 
"acc": 0.20253164556962025, "acc_stderr": 0.026160568246601457, "acc_norm": 0.20253164556962025, "acc_norm_stderr": 0.026160568246601457 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.04327040932578729, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.04327040932578729 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909281, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.28607918263090676, "acc_stderr": 0.016160871405127522, "acc_norm": 0.28607918263090676, "acc_norm_stderr": 0.016160871405127522 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.02392915551735128, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.02392915551735128 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24902216427640156, "acc_stderr": 0.01104489226404077, "acc_norm": 0.24902216427640156, "acc_norm_stderr": 0.01104489226404077 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.19117647058823528, "acc_stderr": 0.02388688192244034, "acc_norm": 0.19117647058823528, "acc_norm_stderr": 0.02388688192244034 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24509803921568626, "acc_stderr": 0.01740181671142765, "acc_norm": 0.24509803921568626, 
"acc_norm_stderr": 0.01740181671142765 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0312678171466318, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.22276621787025705, "mc1_stderr": 0.014566506961396754, "mc2": 0.48577541497626797, "mc2_stderr": 0.016589496055636796 }, "harness|winogrande|5": { "acc": 0.49171270718232046, "acc_stderr": 0.014050555322824194 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
jaykin01/advertisement-copy
--- license: unknown ---
javilonso/restmex23-test
--- dataset_info: features: - name: ID dtype: int64 - name: Title_Review dtype: string splits: - name: train num_bytes: 43836074 num_examples: 107863 download_size: 27668633 dataset_size: 43836074 --- # Dataset Card for "restmex23-test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
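For convenience, a minimal loading sketch; it assumes the Parquet files declared above resolve under the default configuration.

```python
from datasets import load_dataset

# Minimal sketch: load the single declared split and peek at the two columns.
ds = load_dataset("javilonso/restmex23-test", split="train")

print(ds.column_names)         # expected: ['ID', 'Title_Review']
print(ds[0]["Title_Review"])   # first title/review text
```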
ufukhaman/uspto_balanced_200k_ipc_classification
--- annotations_creators: - USPTO language: - en license: - mit pretty_name: uspto_balanced_filtered_200k_ipc_patents size_categories: - 100K<n<1M source_datasets: - USPTO tags: - patent - refined_patents - patent classification - uspto - ipc task_categories: - text-classification task_ids: - topic-classification ---
gzguevara/cat_toy_masked
--- dataset_info: features: - name: prompt dtype: string - name: image dtype: image - name: mask_0 dtype: image - name: mask_1 dtype: image - name: mask_2 dtype: image splits: - name: train num_bytes: 1720118.0 num_examples: 4 - name: test num_bytes: 647304.0 num_examples: 2 download_size: 2446978 dataset_size: 2367422.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
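A minimal usage sketch, assuming the image columns decode to PIL images as declared in the feature schema above:

```python
from datasets import load_dataset

# Minimal sketch: load both splits and inspect one training example.
ds = load_dataset("gzguevara/cat_toy_masked")
example = ds["train"][0]

print(example["prompt"])                              # text prompt for this example
print(example["image"].size)                          # decoded PIL image
print([example[f"mask_{i}"].size for i in range(3)])  # the three mask images
```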
Pelmeshek/twitter_liberal
--- task_categories: - sentence-similarity - text-classification - text-generation - token-classification language: - en tags: - Twitter - Feminism - LGBT - Cultural - Toxic - Social Network - Politics size_categories: - 1K<n<10K --- # Dataset Card for twitter_liberal <!-- Provide a quick summary of the dataset. --> This dataset is compiled from 12 different X (formerly Twitter) accounts covering LGBT, feminism, toxicity, and related topics. <!-- Provide a longer summary of what this dataset is. --> The dataset contains raw user messages and may require additional preprocessing depending on your task. - **Created by:** [Pelmeshek](https://github.com/Pelmeshek1706)
juletxara/visual-spatial-reasoning
--- annotations_creators: - crowdsourced language: - en language_creators: - machine-generated license: - apache-2.0 multilinguality: - monolingual pretty_name: Visual Spatial Reasoning size_categories: - 10K<n<100K source_datasets: - original tags: [] task_categories: - image-classification task_ids: [] --- # Dataset Card for Visual Spatial Reasoning ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://ltl.mmll.cam.ac.uk/ - **Repository:** https://github.com/cambridgeltl/visual-spatial-reasoning - **Paper:** https://arxiv.org/abs/2205.00363 - **Leaderboard:** https://paperswithcode.com/sota/visual-reasoning-on-vsr - **Point of Contact:** https://ltl.mmll.cam.ac.uk/ ### Dataset Summary The Visual Spatial Reasoning (VSR) corpus is a collection of caption-image pairs with true/false labels. Each caption describes the spatial relation of two individual objects in the image, and a vision-language model (VLM) needs to judge whether the caption is correctly describing the image (True) or not (False). ### Supported Tasks and Leaderboards We test three baselines, all supported in huggingface. They are VisualBERT [(Li et al. 2019)](https://arxiv.org/abs/1908.03557), LXMERT [(Tan and Bansal, 2019)](https://arxiv.org/abs/1908.07490) and ViLT [(Kim et al. 2021)](https://arxiv.org/abs/2102.03334). The leaderboard can be checked at [Papers With Code](https://paperswithcode.com/sota/visual-reasoning-on-vsr). model | random split | zero-shot :-------------|:-------------:|:-------------: *human* | *95.4* | *95.4* VisualBERT | 57.4 | 54.0 LXMERT | **72.5** | **63.2** ViLT | 71.0 | 62.4 ### Languages The language in the dataset is English as spoken by the annotators. The BCP-47 code for English is en. [`meta_data.csv`](https://github.com/cambridgeltl/visual-spatial-reasoning/tree/master/data/data_files/meta_data.jsonl) contains meta data of annotators. ## Dataset Structure ### Data Instances Each line is an individual data point. 
Each `jsonl` file is of the following format: ```json {"image": "000000050403.jpg", "image_link": "http://images.cocodataset.org/train2017/000000050403.jpg", "caption": "The teddy bear is in front of the person.", "label": 1, "relation": "in front of", "annotator_id": 31, "vote_true_validator_id": [2, 6], "vote_false_validator_id": []} {"image": "000000401552.jpg", "image_link": "http://images.cocodataset.org/train2017/000000401552.jpg", "caption": "The umbrella is far away from the motorcycle.", "label": 0, "relation": "far away from", "annotator_id": 2, "vote_true_validator_id": [], "vote_false_validator_id": [2, 9, 1]} ``` ### Data Fields `image` denotes name of the image in COCO and `image_link` points to the image on the COCO server (so you can also access directly). `caption` is self-explanatory. `label` being `0` and `1` corresponds to False and True respectively. `relation` records the spatial relation used. `annotator_id` points to the annotator who originally wrote the caption. `vote_true_validator_id` and `vote_false_validator_id` are annotators who voted True or False in the second phase validation. ### Data Splits The VSR corpus, after validation, contains 10,119 data points with high agreement. On top of these, we create two splits (1) random split and (2) zero-shot split. For random split, we randomly split all data points into train, development, and test sets. Zero-shot split makes sure that train, development and test sets have no overlap of concepts (i.e., if *dog* is in test set, it is not used for training and development). Below are some basic statistics of the two splits. split | train | dev | test | total :------|:--------:|:--------:|:--------:|:--------: random | 7,083 | 1,012 | 2,024 | 10,119 zero-shot | 5,440 | 259 | 731 | 6,430 Check out [`data/`](https://github.com/cambridgeltl/visual-spatial-reasoning/tree/master/data) for more details. ## Dataset Creation ### Curation Rationale Understanding spatial relations is fundamental to achieve intelligence. Existing vision-language reasoning datasets are great but they compose multiple types of challenges and can thus conflate different sources of error. The VSR corpus focuses specifically on spatial relations so we can have accurate diagnosis and maximum interpretability. ### Source Data #### Initial Data Collection and Normalization **Image pair sampling.** MS COCO 2017 contains 123,287 images and has labelled the segmentation and classes of 886,284 instances (individual objects). Leveraging the segmentation, we first randomly select two concepts, then retrieve all images containing the two concepts in COCO 2017 (train and validation sets). Then images that contain multiple instances of any of the concept are filtered out to avoid referencing ambiguity. For the single-instance images, we also filter out any of the images with instance area size < 30, 000, to prevent extremely small instances. After these filtering steps, we randomly sample a pair in the remaining images. We repeat such process to obtain a large number of individual image pairs for caption generation. #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process **Fill in the blank: template-based caption generation.** Given a pair of images, the annotator needs to come up with a valid caption that makes it correctly describing one image but incorrect for the other. 
In this way, the annotator could focus on the key difference of the two images (which should be spatial relation of the two objects of interest) and come up with challenging relation that differentiates the two. Similar paradigms are also used in the annotation of previous vision-language reasoning datasets such as NLVR2 (Suhr et al., 2017, 2019) and MaRVL (Liu et al., 2021). To regularise annotators from writing modifiers and differentiating the image pair with things beyond accurate spatial relations, we opt for a template-based classification task instead of free-form caption writing. Besides, the template-generated dataset can be easily categorised based on relations and their meta-categories. The caption template has the format of “The `OBJ1` (is) __ the `OBJ2`.”, and the annotators are instructed to select a relation from a fixed set to fill in the slot. The copula “is” can be omitted for grammaticality. For example, for “contains”, “consists of”, and “has as a part”, “is” should be discarded in the template when extracting the final caption. The fixed set of spatial relations enable us to obtain the full control of the generation process. The full list of used relations are listed in the table below. It contains 71 spatial relations and is adapted from the summarised relation table of Fagundes et al. (2021). We made minor changes to filter out clearly unusable relations, made relation names grammatical under our template, and reduced repeated relations. In our final dataset, 65 out of the 71 available relations are actually included (the other 6 are either not selected by annotators or are selected but the captions did not pass the validation phase). | Category | Spatial Relations | |-------------|-------------------------------------------------------------------------------------------------------------------------------------------------| | Adjacency | Adjacent to, alongside, at the side of, at the right side of, at the left side of, attached to, at the back of, ahead of, against, at the edge of | | Directional | Off, past, toward, down, deep down*, up*, away from, along, around, from*, into, to*, across, across from, through*, down from | | Orientation | Facing, facing away from, parallel to, perpendicular to | | Projective | On top of, beneath, beside, behind, left of, right of, under, in front of, below, above, over, in the middle of | | Proximity | By, close to, near, far from, far away from | | Topological | Connected to, detached from, has as a part, part of, contains, within, at, on, in, with, surrounding, among, consists of, out of, between, inside, outside, touching | | Unallocated | Beyond, next to, opposite to, after*, among, enclosed by | **Second-round Human Validation.** Every annotated data point is reviewed by at least two additional human annotators (validators). In validation, given a data point (consists of an image and a caption), the validator gives either a True or False label. We exclude data points that have < 2/3 validators agreeing with the original label. In the guideline, we communicated to the validators that, for relations such as “left”/“right”, “in front of”/“behind”, they should tolerate different reference frame: i.e., if the caption is true from either the object’s or the viewer’s reference, it should be given a True label. Only when the caption is incorrect under all reference frames, a False label is assigned. 
This adds difficulty to the models since they cannot naively rely on the relative locations of the objects in the images but also need to correctly identify the orientations of the objects to make the best judgement. #### Who are the annotators? Annotators are hired from [prolific.co](https://prolific.co). We require that they (1) have at least a bachelor’s degree, (2) are fluent in English or are native speakers, and (3) have a >99% historical approval rate on the platform. All annotators are paid an hourly rate of 12 GBP. Prolific takes an extra 33% service charge and 20% VAT on the service charge. For caption generation, we release the task in batches of 200 instances and the annotator is required to finish a batch in 80 minutes. An annotator cannot take more than one batch per day. In this way we have a diverse set of annotators and can also prevent annotators from becoming fatigued. For second-round validation, we group 500 data points in one batch and an annotator is asked to label each batch in 90 minutes. In total, 24 annotators participated in caption generation and 26 participated in validation. The annotators have diverse demographic backgrounds: they were born in 13 different countries; live in 13 different countries; and have 14 different nationalities. 57.4% of the annotators identify themselves as female and 42.6% as male. ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information This project is licensed under the [Apache-2.0 License](https://github.com/cambridgeltl/visual-spatial-reasoning/blob/master/LICENSE). ### Citation Information ```bibtex @article{Liu2022VisualSR, title={Visual Spatial Reasoning}, author={Fangyu Liu and Guy Edward Toh Emerson and Nigel Collier}, journal={ArXiv}, year={2022}, volume={abs/2205.00363} } ``` ### Contributions Thanks to [@juletx](https://github.com/juletx) for adding this dataset.
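As a usage note for the `jsonl` format documented in the Data Instances and Data Fields sections above, the following is a minimal sketch for reading one of the data files and tallying labels and relations; the file path below is a placeholder, not an actual file name from the repository.

```python
import json
from collections import Counter

# Placeholder path: point this at a jsonl split file from the repository's data/ directory.
path = "data/data_files/train.jsonl"

label_counts, relation_counts = Counter(), Counter()
with open(path, encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)             # fields: image, image_link, caption, label, relation, ...
        label_counts[record["label"]] += 1    # 1 = True, 0 = False
        relation_counts[record["relation"]] += 1

print(label_counts)
print(relation_counts.most_common(10))
```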
thiomajid/exp
--- dataset_info: features: - name: prompt dtype: string - name: answer dtype: string splits: - name: apologetic num_bytes: 614448 num_examples: 558 - name: non_apologetic num_bytes: 9441420 num_examples: 5047 download_size: 4301960 dataset_size: 10055868 --- # Dataset Card for "exp" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
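A minimal loading sketch, assuming the two named splits declared above are exposed as-is on the Hub:

```python
from datasets import load_dataset

# Minimal sketch: the card declares "apologetic" and "non_apologetic" splits
# instead of the usual train/test split.
apologetic = load_dataset("thiomajid/exp", split="apologetic")
non_apologetic = load_dataset("thiomajid/exp", split="non_apologetic")

print(len(apologetic), len(non_apologetic))   # expected: 558 and 5047
print(apologetic[0]["prompt"])
print(apologetic[0]["answer"])
```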
jlbaker361/avatar-lite
--- dataset_info: features: - name: image dtype: image - name: src dtype: string - name: split dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 1595867500.125 num_examples: 2031 download_size: 1595634006 dataset_size: 1595867500.125 --- # Dataset Card for "avatar-lite" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
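A minimal loading sketch based on the feature schema above (the `image` column should decode to a PIL image):

```python
from datasets import load_dataset

# Minimal sketch: inspect one record of the single train split.
ds = load_dataset("jlbaker361/avatar-lite", split="train")
row = ds[0]

print(row["id"], row["src"], row["split"])   # metadata columns
print(row["image"].size)                     # decoded PIL image
```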
distilled-from-one-sec-cv12/chunk_136
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1171640732 num_examples: 228301 download_size: 1196091420 dataset_size: 1171640732 --- # Dataset Card for "chunk_136" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
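A minimal loading sketch; streaming is used here only because the split is fairly large (~1.2 GB), and the exact shapes depend on how the chunks were produced.

```python
from datasets import load_dataset

# Minimal sketch: stream the train split and look at one logits/MFCC pair.
ds = load_dataset("distilled-from-one-sec-cv12/chunk_136", split="train", streaming=True)
row = next(iter(ds))

print(len(row["logits"]))                     # length of the logits vector
print(len(row["mfcc"]), len(row["mfcc"][0]))  # number of MFCC frames x coefficients per frame
```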
edu745181/trivia_qa
--- license: apache-2.0 ---
betteruncensored/VMware-open-instruct
--- dataset_info: features: - name: alpaca_prompt dtype: string - name: response dtype: string - name: instruction dtype: string - name: source dtype: string - name: task_name dtype: string - name: template_type dtype: string splits: - name: train num_bytes: 125656035 num_examples: 142622 download_size: 57912402 dataset_size: 125656035 license: cc-by-3.0 task_categories: - text-generation - text2text-generation language: - en pretty_name: T size_categories: - 100K<n<1M --- # Dataset Card for "open-instruct" Better Uncensored This is the VMware/open-instruct dataset processed with the Better Uncensored pipeline. A bit more than 4,000 records (less than 5%) were censored. The format is kept the same for compatibility. # Dataset Card for "open-instruct" This dataset is a combination of: 1. Filtered subset of [OpenAssistant/oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1) 2. train split of [Mosaic-dolly-hhrlhf](https://huggingface.co/datasets/mosaicml/dolly_hhrlhf) (consists of [Databricks' dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) dataset and a filtered subset of [Anthropic's HH-RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf)). 3. Filtered subset of [conceptofmind/cot_submix_original](https://huggingface.co/datasets/conceptofmind/cot_submix_original) ## Dataset The dataset consists of 6 columns: 1. instruction: The natural language instruction without any prompt templates (we extracted them out of the alpaca-format in Mosaic-dolly-hhrlhf) 2. alpaca_prompt: Alpaca prompt template versions of instruction 3. response: The response to the instruction 4. source: Dataset source 5. task_name 6. template_type: flan template used (zeroshot or fewshot) ## License - It is usable for commercial purposes so long as you follow the terms of the license. ### Dataset subset licenses: - Open-instruct-v1-dolly-hhrlhf-oasst1 (Mosaic/Dolly-HHRLHF + filtered OASST1) - cc by 3.0 Subset of COT SUBMIX (FROM FLAN V2) Zeroshot examples: - ESNLI - MIT - ECQA - CDLA 1.0 - Sharing - Strategy - MIT - CREAK - MIT - gsm8k - MIT - aqua - MIT - qasc - Apache 2.0 Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license: Wikipedia (various pages) - https://www.wikipedia.org/ - Copyright © Wikipedia editors and contributors. Databricks (https://www.databricks.com) - Copyright © Databricks Mosaic ML (https://www.mosaicml.com/) - Copyright © Mosaic ML VMware - Copyright © VMware [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
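A minimal usage sketch for the columns listed above, e.g. to see how records are distributed across sources and FLAN template types:

```python
from collections import Counter
from datasets import load_dataset

# Minimal sketch: load the train split and summarise two of the metadata columns.
ds = load_dataset("betteruncensored/VMware-open-instruct", split="train")

print(Counter(ds["source"]).most_common())         # which source dataset each record came from
print(Counter(ds["template_type"]).most_common())  # zeroshot vs fewshot FLAN templates
print(ds[0]["alpaca_prompt"])                      # Alpaca-style prompt for the first record
```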
SaiedAlshahrani/Moroccan_Arabic_Wikipedia_20230101_nobots
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 7334642 num_examples: 4675 download_size: 2883783 dataset_size: 7334642 license: mit language: - ar pretty_name: arywiki-articles-withoutbots size_categories: - 1K<n<10K --- # Dataset Card for "Moroccan_Arabic_Wikipedia_20230101_nobots" This dataset is created using the Moroccan Arabic Wikipedia articles (**after removing bot-generated articles**), downloaded on the 1st of January 2023, processed using `Gensim` Python library, and preprocessed using `tr` Linux/Unix utility and `CAMeLTools` Python toolkit for Arabic NLP. This dataset was used to train this Moroccan Arabic Wikipedia Masked Language Model: [SaiedAlshahrani/arywiki_20230101_roberta_mlm_nobots](https://huggingface.co/SaiedAlshahrani/arywiki_20230101_roberta_mlm_nobots). For more details about the dataset, please **read** and **cite** our paper: ```bash @inproceedings{alshahrani-etal-2023-performance, title = "{Performance Implications of Using Unrepresentative Corpora in {A}rabic Natural Language Processing}", author = "Alshahrani, Saied and Alshahrani, Norah and Dey, Soumyabrata and Matthews, Jeanna", booktitle = "Proceedings of the The First Arabic Natural Language Processing Conference (ArabicNLP 2023)", month = December, year = "2023", address = "Singapore (Hybrid)", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2023.arabicnlp-1.19", doi = "10.18653/v1/2023.arabicnlp-1.19", pages = "218--231", abstract = "Wikipedia articles are a widely used source of training data for Natural Language Processing (NLP) research, particularly as corpora for low-resource languages like Arabic. However, it is essential to understand the extent to which these corpora reflect the representative contributions of native speakers, especially when many entries in a given language are directly translated from other languages or automatically generated through automated mechanisms. In this paper, we study the performance implications of using inorganic corpora that are not representative of native speakers and are generated through automated techniques such as bot generation or automated template-based translation. The case of the Arabic Wikipedia editions gives a unique case study of this since the Moroccan Arabic Wikipedia edition (ARY) is small but representative, the Egyptian Arabic Wikipedia edition (ARZ) is large but unrepresentative, and the Modern Standard Arabic Wikipedia edition (AR) is both large and more representative. We intrinsically evaluate the performance of two main NLP upstream tasks, namely word representation and language modeling, using word analogy evaluations and fill-mask evaluations using our two newly created datasets: Arab States Analogy Dataset (ASAD) and Masked Arab States Dataset (MASD). We demonstrate that for good NLP performance, we need both large and organic corpora; neither alone is sufficient. We show that producing large corpora through automated means can be a counter-productive, producing models that both perform worse and lack cultural richness and meaningful representation of the Arabic language and its native speakers.", } ```
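A minimal loading sketch for the corpus itself (a single `text` column, as declared in the schema above):

```python
from datasets import load_dataset

# Minimal sketch: load the train split of the article corpus and peek at one article.
ds = load_dataset("SaiedAlshahrani/Moroccan_Arabic_Wikipedia_20230101_nobots", split="train")

print(len(ds))              # expected: 4675 articles
print(ds[0]["text"][:200])  # first 200 characters of the first article
```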
rohitsuv/tokenized-codeparrot-train-verilog
--- dataset_info: features: - name: input_ids sequence: int32 - name: ratio_char_token dtype: float64 splits: - name: train num_bytes: 3664280 num_examples: 5906 download_size: 879597 dataset_size: 3664280 --- # Dataset Card for "tokenized-codeparrot-train-verilog" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
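A minimal loading sketch; note that decoding `input_ids` back to Verilog source would require knowing which tokenizer produced them, which the card does not state.

```python
from datasets import load_dataset

# Minimal sketch: the rows are already tokenized, so we only inspect lengths
# and the character-per-token ratio.
ds = load_dataset("rohitsuv/tokenized-codeparrot-train-verilog", split="train")
row = ds[0]

print(len(row["input_ids"]))    # number of tokens in this example
print(row["ratio_char_token"])  # characters per token for the original text
```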
Nexdata/6020000_Groups_Chinese_French_Parallel_Corpus_Data
--- license: cc-by-nc-nd-4.0 --- ## Description 1 Million Pairs of Sentences - Chinese-French Parallel Corpus. The data is stored in TXT format and covers multiple fields such as tourism, medical treatment, daily life, TV plays, etc. Data desensitization and quality checking have been completed. It can be used as a basic corpus for text data analysis in fields such as machine translation. For more details, please refer to the link: https://www.nexdata.ai/dataset/1070?source=Huggingface # Specifications ## Format TXT ## Data content Chinese-French Parallel Corpus ## Data size 6.02 million pairs of Chinese-French Parallel Corpus Data. The Chinese sentences contain 14.8 characters on average. ## Language Chinese, French ## Applications Machine translation ## Accuracy rate 90% # Licensing Information Commercial License
Atipico1/SQuAD_under_150
--- dataset_info: features: - name: id dtype: string - name: title dtype: string - name: context dtype: string - name: question dtype: string - name: answers struct: - name: answer_start sequence: int64 - name: text sequence: string - name: masked_query dtype: string - name: query_embedding sequence: float64 splits: - name: train num_bytes: 640360374.9375818 num_examples: 90871 download_size: 470992122 dataset_size: 640360374.9375818 configs: - config_name: default data_files: - split: train path: data/train-* ---
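A minimal loading sketch for the SQuAD-style fields plus the extra `masked_query` and `query_embedding` columns declared above:

```python
from datasets import load_dataset

# Minimal sketch: inspect one record of the train split.
ds = load_dataset("Atipico1/SQuAD_under_150", split="train")
row = ds[0]

print(row["question"])
print(row["answers"]["text"], row["answers"]["answer_start"])  # SQuAD-style answers struct
print(row["masked_query"])
print(len(row["query_embedding"]))   # dimensionality of the precomputed query embedding
```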
open-llm-leaderboard/details_sumo43__Yi-34b-x2
--- pretty_name: Evaluation run of sumo43/Yi-34b-x2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [sumo43/Yi-34b-x2](https://huggingface.co/sumo43/Yi-34b-x2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sumo43__Yi-34b-x2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-15T15:14:57.925776](https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__Yi-34b-x2/blob/main/results_2024-01-15T15-14-57.925776.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7619039072175088,\n\ \ \"acc_stderr\": 0.028287211728702223,\n \"acc_norm\": 0.7672823242487893,\n\ \ \"acc_norm_stderr\": 0.028809880216771097,\n \"mc1\": 0.5740514075887393,\n\ \ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7210377690522727,\n\ \ \"mc2_stderr\": 0.014187472355015407\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716417,\n\ \ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.665803624775941,\n\ \ \"acc_stderr\": 0.004707447244200623,\n \"acc_norm\": 0.8570005974905397,\n\ \ \"acc_norm_stderr\": 0.0034935679140932928\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n\ \ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n\ \ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549912,\n\ \ \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549912\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\ \ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \ \ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n\ \ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\ \ \"acc_stderr\": 0.026280550932848052,\n \"acc_norm\": 0.8888888888888888,\n\ \ \"acc_norm_stderr\": 0.026280550932848052\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\ \ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n\ \ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n\ \ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.04913595201274503,\n\ \ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.04913595201274503\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\ \ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.027501752944412417,\n\ \ \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.027501752944412417\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\ \ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\ \ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746304,\n\ \ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746304\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.7116402116402116,\n \"acc_stderr\": 0.023330654054535903,\n \"\ acc_norm\": 0.7116402116402116,\n \"acc_norm_stderr\": 0.023330654054535903\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n\ \ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.6111111111111112,\n\ \ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488313,\n \"\ acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488313\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n \"acc_norm\"\ : 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n },\n\ \ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\ : 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n\ \ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706456,\n\ \ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706456\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"\ acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 
0.012525310625527033,\n\ \ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \ \ \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.45185185185185184,\n \"acc_stderr\": 0.030343862998512623,\n \ \ \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.030343862998512623\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02476290267805791,\n \ \ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02476290267805791\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\ acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"\ acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.6666666666666666,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\"\ : 0.6666666666666666,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n\ \ \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n\ \ \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n\ \ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\ \ \"acc_stderr\": 0.02693611191280226,\n \"acc_norm\": 0.7982062780269058,\n\ \ \"acc_norm_stderr\": 0.02693611191280226\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\ \ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"\ acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\ \ \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n\ \ \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n\ \ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\ \ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\ \ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\ \ \"acc_norm_stderr\": 0.015006312806446912\n },\n 
\"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \ \ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n\ \ \"acc_stderr\": 0.01052403107905584,\n \"acc_norm\": 0.9042145593869731,\n\ \ \"acc_norm_stderr\": 0.01052403107905584\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423203,\n\ \ \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423203\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7966480446927374,\n\ \ \"acc_stderr\": 0.013461351487507524,\n \"acc_norm\": 0.7966480446927374,\n\ \ \"acc_norm_stderr\": 0.013461351487507524\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043718,\n\ \ \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043718\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8167202572347267,\n\ \ \"acc_stderr\": 0.02197419884826582,\n \"acc_norm\": 0.8167202572347267,\n\ \ \"acc_norm_stderr\": 0.02197419884826582\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n\ \ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \ \ \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5808344198174706,\n\ \ \"acc_stderr\": 0.012602244505788228,\n \"acc_norm\": 0.5808344198174706,\n\ \ \"acc_norm_stderr\": 0.012602244505788228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581774,\n\ \ \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581774\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \ \ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\ \ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\ \ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265015,\n\ \ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265015\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\ \ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\ \ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \ \ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\ \ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\ \ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.023537557657892567,\n\ \ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.023537557657892567\n\ \ },\n 
\"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\ \ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7210377690522727,\n\ \ \"mc2_stderr\": 0.014187472355015407\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247001\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6004548900682335,\n \ \ \"acc_stderr\": 0.013491660298815988\n }\n}\n```" repo_url: https://huggingface.co/sumo43/Yi-34b-x2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|arc:challenge|25_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-15T15-14-57.925776.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|gsm8k|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hellaswag|10_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T15-14-57.925776.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T15-14-57.925776.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T15-14-57.925776.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-15T15-14-57.925776.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-management|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T15-14-57.925776.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|truthfulqa:mc|0_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-15T15-14-57.925776.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_15T15_14_57.925776 path: - '**/details_harness|winogrande|5_2024-01-15T15-14-57.925776.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-15T15-14-57.925776.parquet' - config_name: results data_files: - split: 2024_01_15T15_14_57.925776 path: - results_2024-01-15T15-14-57.925776.parquet - split: latest path: - results_2024-01-15T15-14-57.925776.parquet --- # Dataset Card for Evaluation run of sumo43/Yi-34b-x2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [sumo43/Yi-34b-x2](https://huggingface.co/sumo43/Yi-34b-x2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_sumo43__Yi-34b-x2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-15T15:14:57.925776](https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__Yi-34b-x2/blob/main/results_2024-01-15T15-14-57.925776.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7619039072175088, "acc_stderr": 0.028287211728702223, "acc_norm": 0.7672823242487893, "acc_norm_stderr": 0.028809880216771097, "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.7210377690522727, "mc2_stderr": 0.014187472355015407 }, "harness|arc:challenge|25": { "acc": 0.6979522184300341, "acc_stderr": 0.013417519144716417, "acc_norm": 0.7286689419795221, "acc_norm_stderr": 0.012993807727545796 }, "harness|hellaswag|10": { "acc": 0.665803624775941, "acc_stderr": 0.004707447244200623, "acc_norm": 0.8570005974905397, "acc_norm_stderr": 0.0034935679140932928 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7481481481481481, "acc_stderr": 0.03749850709174021, "acc_norm": 0.7481481481481481, "acc_norm_stderr": 0.03749850709174021 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.868421052631579, "acc_stderr": 0.027508689533549912, "acc_norm": 0.868421052631579, "acc_norm_stderr": 0.027508689533549912 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8075471698113208, "acc_stderr": 0.024262979839372274, "acc_norm": 0.8075471698113208, "acc_norm_stderr": 0.024262979839372274 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8888888888888888, "acc_stderr": 0.026280550932848052, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.026280550932848052 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7572254335260116, "acc_stderr": 0.0326926380614177, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5784313725490197, "acc_stderr": 0.04913595201274503, "acc_norm": 0.5784313725490197, "acc_norm_stderr": 0.04913595201274503 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 
0.7702127659574468, "acc_stderr": 0.027501752944412417, "acc_norm": 0.7702127659574468, "acc_norm_stderr": 0.027501752944412417 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5964912280701754, "acc_stderr": 0.04615186962583707, "acc_norm": 0.5964912280701754, "acc_norm_stderr": 0.04615186962583707 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7241379310344828, "acc_stderr": 0.037245636197746304, "acc_norm": 0.7241379310344828, "acc_norm_stderr": 0.037245636197746304 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.7116402116402116, "acc_stderr": 0.023330654054535903, "acc_norm": 0.7116402116402116, "acc_norm_stderr": 0.023330654054535903 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04360314860077459, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9096774193548387, "acc_stderr": 0.016306570644488313, "acc_norm": 0.9096774193548387, "acc_norm_stderr": 0.016306570644488313 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.645320197044335, "acc_stderr": 0.03366124489051449, "acc_norm": 0.645320197044335, "acc_norm_stderr": 0.03366124489051449 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8666666666666667, "acc_stderr": 0.026544435312706456, "acc_norm": 0.8666666666666667, "acc_norm_stderr": 0.026544435312706456 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9191919191919192, "acc_stderr": 0.019417681889724536, "acc_norm": 0.9191919191919192, "acc_norm_stderr": 0.019417681889724536 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9689119170984456, "acc_stderr": 0.012525310625527033, "acc_norm": 0.9689119170984456, "acc_norm_stderr": 0.012525310625527033 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8153846153846154, "acc_stderr": 0.01967163241310029, "acc_norm": 0.8153846153846154, "acc_norm_stderr": 0.01967163241310029 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.45185185185185184, "acc_stderr": 0.030343862998512623, "acc_norm": 0.45185185185185184, "acc_norm_stderr": 0.030343862998512623 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02476290267805791, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02476290267805791 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5099337748344371, "acc_stderr": 0.04081677107248437, "acc_norm": 0.5099337748344371, "acc_norm_stderr": 0.04081677107248437 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9137614678899083, "acc_stderr": 0.012035597300116245, "acc_norm": 0.9137614678899083, "acc_norm_stderr": 0.012035597300116245 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0321495214780275, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0321495214780275 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9071729957805907, 
"acc_stderr": 0.01888975055095671, "acc_norm": 0.9071729957805907, "acc_norm_stderr": 0.01888975055095671 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.02693611191280226, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.02693611191280226 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8925619834710744, "acc_stderr": 0.028268812192540637, "acc_norm": 0.8925619834710744, "acc_norm_stderr": 0.028268812192540637 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8981481481481481, "acc_stderr": 0.02923927267563275, "acc_norm": 0.8981481481481481, "acc_norm_stderr": 0.02923927267563275 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8650306748466258, "acc_stderr": 0.026845765054553838, "acc_norm": 0.8650306748466258, "acc_norm_stderr": 0.026845765054553838 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8737864077669902, "acc_stderr": 0.03288180278808628, "acc_norm": 0.8737864077669902, "acc_norm_stderr": 0.03288180278808628 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9444444444444444, "acc_stderr": 0.015006312806446912, "acc_norm": 0.9444444444444444, "acc_norm_stderr": 0.015006312806446912 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.9, "acc_stderr": 0.03015113445777634, "acc_norm": 0.9, "acc_norm_stderr": 0.03015113445777634 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9042145593869731, "acc_stderr": 0.01052403107905584, "acc_norm": 0.9042145593869731, "acc_norm_stderr": 0.01052403107905584 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8121387283236994, "acc_stderr": 0.021029269752423203, "acc_norm": 0.8121387283236994, "acc_norm_stderr": 0.021029269752423203 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7966480446927374, "acc_stderr": 0.013461351487507524, "acc_norm": 0.7966480446927374, "acc_norm_stderr": 0.013461351487507524 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8562091503267973, "acc_stderr": 0.020091188936043718, "acc_norm": 0.8562091503267973, "acc_norm_stderr": 0.020091188936043718 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8167202572347267, "acc_stderr": 0.02197419884826582, "acc_norm": 0.8167202572347267, "acc_norm_stderr": 0.02197419884826582 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8672839506172839, "acc_stderr": 0.018877353839571842, "acc_norm": 0.8672839506172839, "acc_norm_stderr": 0.018877353839571842 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6276595744680851, "acc_stderr": 0.02883892147125145, "acc_norm": 0.6276595744680851, "acc_norm_stderr": 0.02883892147125145 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5808344198174706, "acc_stderr": 0.012602244505788228, "acc_norm": 0.5808344198174706, "acc_norm_stderr": 0.012602244505788228 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8272058823529411, "acc_stderr": 0.022966067585581774, "acc_norm": 0.8272058823529411, "acc_norm_stderr": 0.022966067585581774 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.815359477124183, "acc_stderr": 0.01569702924075778, "acc_norm": 0.815359477124183, "acc_norm_stderr": 0.01569702924075778 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8285714285714286, "acc_stderr": 0.02412746346265015, "acc_norm": 0.8285714285714286, "acc_norm_stderr": 0.02412746346265015 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.021166216304659393, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.021166216304659393 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8947368421052632, "acc_stderr": 0.023537557657892567, "acc_norm": 0.8947368421052632, "acc_norm_stderr": 0.023537557657892567 }, "harness|truthfulqa:mc|0": { "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.7210377690522727, "mc2_stderr": 0.014187472355015407 }, "harness|winogrande|5": { "acc": 0.8279400157853196, "acc_stderr": 0.010607731615247001 }, "harness|gsm8k|5": { "acc": 0.6004548900682335, "acc_stderr": 0.013491660298815988 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
zolak/twitter_dataset_50_1713158338
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 335468 num_examples: 819 download_size: 168190 dataset_size: 335468 configs: - config_name: default data_files: - split: train path: data/train-* ---
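A minimal loading sketch, assuming only the default config and `train` split declared above:

```python
from datasets import load_dataset

# Load the single train split and inspect one scraped-tweet record.
# Column names follow the features list in the card's dataset_info.
ds = load_dataset("zolak/twitter_dataset_50_1713158338", split="train")
print(ds.column_names)            # id, tweet_content, user_name, user_id, created_at, ...
print(ds[0]["tweet_content"], ds[0]["created_at"])
```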
Lyn4ever29/GuwenEE
--- license: cc-by-4.0 ---
Taeyoung/helloworld
--- dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 2080649.0 num_examples: 6 download_size: 0 dataset_size: 2080649.0 --- # Dataset Card for "helloworld" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Seongill/Trivia_5_only_adversary_1159
--- dataset_info: features: - name: question dtype: string - name: answers sequence: string - name: has_answer dtype: bool - name: similar_sub dtype: string - name: ctxs list: - name: answer_sent sequence: string - name: hasanswer dtype: bool - name: id dtype: string - name: is_adv dtype: bool - name: new_answer_sent dtype: string - name: original_text dtype: string - name: score dtype: float64 - name: text dtype: string - name: title dtype: string - name: num_advs dtype: int64 splits: - name: train num_bytes: 7404322 num_examples: 1159 download_size: 3197268 dataset_size: 7404322 configs: - config_name: default data_files: - split: train path: data/train-* ---
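A minimal sketch of reading one example, assuming the nested `ctxs` field loads as a list of per-passage records as declared above:

```python
from datasets import load_dataset

# Each question carries retrieved passages; count how many are flagged as
# adversarial (is_adv) and how many actually contain the answer (hasanswer).
ds = load_dataset("Seongill/Trivia_5_only_adversary_1159", split="train")
ex = ds[0]
n_adv = sum(1 for ctx in ex["ctxs"] if ctx["is_adv"])
n_ans = sum(1 for ctx in ex["ctxs"] if ctx["hasanswer"])
print(ex["question"], "| num_advs:", ex["num_advs"], "| adversarial:", n_adv, "| answer-bearing:", n_ans)
```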
Guy2/AirportSecurity
--- license: cc-by-nc-nd-3.0 ---
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-27000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 1086901 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
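A minimal sketch for checking the pre-tokenized, nested structure (each field is a sequence of token-id sequences) before feeding it to a seq2seq trainer:

```python
from datasets import load_dataset

# The rows are already tokenized, so the useful sanity check is the
# outer/inner lengths of input_ids and labels.
ds = load_dataset(
    "kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-27000",
    split="train",
)
ex = ds[0]
print(len(ex["input_ids"]), len(ex["input_ids"][0]))   # number of sequences, tokens in the first one
print(len(ex["labels"]), len(ex["labels"][0]))
```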
projecte-aina/MuST-SHE_en-ca
--- license: cc-by-nc-nd-4.0 task_categories: - translation language: - en - ca tags: - bias - gender bias - evaluation ---
# Dataset Card for MuST-SHE_en-ca
## Dataset Description - **Point of Contact:** langtech@bsc.es
### Dataset Summary MuST-SHE_en-ca is an English-Catalan evaluation dataset of **1.046** examples, created to support evaluation of Catalan NLP tasks, specifically Gender Bias evaluation in Machine Translation. This dataset is derived from MuST-SHE English-Spanish, by translating the Spanish portion into Catalan. For more information about the original MuST-SHE dataset: https://mt.fbk.eu/must-she/
### Supported Tasks and Leaderboards The dataset has been designed for the evaluation of gender bias in Machine Translation from English to Catalan. As this corpus is based on natural language, it allows for different insights than popular template-based gender bias evaluation sets.
### Languages The languages included in the dataset are English (EN) and Catalan (CA).
## Dataset Structure
### Data Instances The dataset is composed of a single tsv file containing 1.046 rows. - MuST-SHE_en-ca.tsv The dataset follows the structure of the original MuST-SHE dataset. The majority of the data fields have not been altered in any way. The only data fields which have been changed are those relating to the translation from Spanish to Catalan, namely: - LANG (es -> ca) - REF (es -> ca) - WRONG-REF (es -> ca) - GENDERTERMS (es -> ca) The original dataset contains the data field CATEGORY, dividing the segments into four categories based on the presence or absence of gender information. However, as the original dataset is designed for the evaluation of Speech Data, the segments in which gender information is present in the audio can be considered to contain no gender information in the context of text-based Machine Translation. We therefore added the extra column "TEXT-CATEGORY", specifically meant for textual Machine Translation tasks. In "TEXT-CATEGORY", instances are classified into two distinct categories: * sentences in which the text contains sufficient information to disambiguate gender. * sentences in which the text does not contain sufficient information to disambiguate gender.
### Data fields The data fields follow the structure of the original MuST-SHE dataset. - ID - LANG - TALK - MuSTC-v1.0-SET - SRC - REF - WRONG-REF - SPEAKER - GENDER - CATEGORY - TEXT-CATEGORY - COMMENT - FREE-REF - GENDERTERMS More information about these data fields can be found in the [MuST-SHE dataset card](https://mt.fbk.eu/data-statement-for-must-she/).
### Data Splits The dataset contains a single split for evaluation.
### Curation Rationale This dataset is aimed at evaluating Gender Bias in Machine Translation from English to Catalan, in order to promote fairer MT outputs when translating from a gender-neutral language, such as English, into a grammatically gendered language, such as Catalan.
## Source Data
#### Initial Data Collection and Normalization The original [MuST-SHE](https://mt.fbk.eu/must-she/) dataset is a subset of the TED-based [MuST-C corpus](https://mt.fbk.eu/must-c/). MuST-SHE_en-ca was created by automatically translating the Spanish components of the English-Spanish MuST-SHE using the [PlanTL Project's Spanish-Catalan machine translation model](https://huggingface.co/PlanTL-GOB-ES/mt-plantl-es-ca). Gender terms were extracted automatically and both the gender terms and the automatically translated sentences were then extensively reviewed by a native Catalan speaker to ensure accuracy.
#### Who are the source data producers? [Machine Translation group at Fondazione Bruno Kessler](https://mt.fbk.eu/)
### Annotations
#### Annotation process For each segment, we added an extra column "TEXT-CATEGORY", specifically meant for textual Machine Translation tasks. In "TEXT-CATEGORY", segments are classified into two distinct categories: * sentences in which the text contains sufficient information to disambiguate gender. * sentences in which the text does not contain sufficient information to disambiguate gender. All translations from Spanish were automatically generated using the [PlanTL es->ca model](https://huggingface.co/PlanTL-GOB-ES/mt-plantl-es-ca) and manually revised by a native Catalan speaker. * Information about the original annotation process of MuST-SHE can be found in the [MuST-SHE dataset card](https://mt.fbk.eu/data-statement-for-must-she/).
#### Who are the annotators? The annotation was done internally by BSC LangTech collaborators.
### Personal and Sensitive Information No anonymisation process was performed.
## Considerations for Using the Data
### Social Impact of Dataset The specific purpose of this dataset is to help evaluate the gender bias of Machine Translation engines when translating from a gender-neutral language, such as English, into a grammatically gendered language, such as Catalan. Such evaluation may contribute to promoting fairer MT outputs in Catalan when translating from English. At a broad level, by providing this resource, we intend to promote the use of Catalan across NLP tasks, thereby improving the accessibility and visibility of the Catalan language.
### Discussion of Biases This dataset has been specifically designed to assess Gender Bias in Machine Translation. Inherent biases of other types (such as racial, ethnic, socio-economic bias, etc.) may be present in the data. No specific mitigation strategies for these other types of bias have been applied to this dataset.
### Other Known Limitations The dataset contains general-domain data. Applications of this dataset in more specific domains, such as biomedical or legal text, would be of limited use.
## Additional Information
### Dataset Curators Language Technologies Unit at the Barcelona Supercomputing Center (langtech@bsc.es). This work has been promoted and financed by the Generalitat de Catalunya through the [Aina project](https://projecteaina.cat/).
### Licensing Information This work is licensed under a [Creative Commons Attribution-NonCommercial-NoDerivs 4.0](https://creativecommons.org/licenses/by-nc-nd/4.0/).
### Citation Information ``` @inproceedings{bentivogli-etal-2020-gender, title = "Gender in Danger? Evaluating Speech Translation Technology on the {M}u{ST}-{SHE} Corpus", author = "Bentivogli, Luisa and Savoldi, Beatrice and Negri, Matteo and Di Gangi, Mattia A. and Cattoni, Roldano and Turchi, Marco", editor = "Jurafsky, Dan and Chai, Joyce and Schluter, Natalie and Tetreault, Joel", booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics", month = jul, year = "2020", address = "Online", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2020.acl-main.619", doi = "10.18653/v1/2020.acl-main.619", pages = "6923--6933", abstract = "Translating from languages without productive grammatical gender like English into gender-marked languages is a well-known difficulty for machines.
This difficulty is also due to the fact that the training data on which models are built typically reflect the asymmetries of natural languages, gender bias included. Exclusively fed with textual data, machine translation is intrinsically constrained by the fact that the input sentence does not always contain clues about the gender identity of the referred human entities. But what happens with speech translation, where the input is an audio signal? Can audio provide additional information to reduce gender bias? We present the first thorough investigation of gender bias in speech translation, contributing with: i) the release of a benchmark useful for future studies, and ii) the comparison of different technologies (cascade and end-to-end) on two language directions (English-Italian/French).", } ``` ``` @article{article, author = {Cattoni, Roldano and Di Gangi, Mattia and Bentivogli, Luisa and Negri, Matteo and Turchi, Marco}, year = {2021}, month = {03}, pages = {101155}, title = {MuST-C: A multilingual corpus for end-to-end speech translation}, volume = {66}, journal = {Computer Speech & Language}, doi = {10.1016/j.csl.2020.101155} } ``` ### Contributions [N/A]
liuyanchen1015/MULTI_VALUE_mnli_completive_done
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 290459 num_examples: 1226 - name: dev_mismatched num_bytes: 377910 num_examples: 1509 - name: test_matched num_bytes: 296760 num_examples: 1199 - name: test_mismatched num_bytes: 380324 num_examples: 1541 - name: train num_bytes: 11563230 num_examples: 48515 download_size: 8113679 dataset_size: 12908683 --- # Dataset Card for "MULTI_VALUE_mnli_completive_done" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
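A short loading sketch, assuming the splits listed in the metadata above are exposed through the 🤗 `datasets` library:

```python
from datasets import load_dataset

# Load the matched development split (split names taken from the metadata above).
dev = load_dataset("liuyanchen1015/MULTI_VALUE_mnli_completive_done", split="dev_matched")

# Each row is an MNLI-style premise/hypothesis pair with its label, index and score.
example = dev[0]
print(example["premise"])
print(example["hypothesis"], "->", example["label"], example["score"])
```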
mteb/scala_da_classification
--- dataset_info: features: - name: text dtype: string - name: corruption_type dtype: string - name: label dtype: string splits: - name: train num_bytes: 139194 num_examples: 1024 - name: test num_bytes: 281517 num_examples: 2048 - name: full_train num_bytes: 733506 num_examples: 5342 - name: val num_bytes: 32942 num_examples: 256 download_size: 700593 dataset_size: 1187159 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: full_train path: data/full_train-* - split: val path: data/val-* ---
nirantk/chaii-hindi-and-tamil-question-answering
---
task_categories:
- question-answering
language:
- hi
- ta
pretty_name: Chaii Hindi and Tamil Question Answering
size_categories:
- 10K<n<100K
---
nourheshamshaheen/final_chart_to_table
--- dataset_info: features: - name: image dtype: image - name: text dtype: string - name: type dtype: string splits: - name: train num_bytes: 101059151.385 num_examples: 2245 - name: test num_bytes: 25058843.0 num_examples: 562 download_size: 108890579 dataset_size: 126117994.385 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* --- # Dataset Card for "final_chart_to_table" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
nojiyoon/shilla-clothing-text-and-image-dataset
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: caption dtype: string splits: - name: train num_bytes: 79876939.0 num_examples: 207 download_size: 79818809 dataset_size: 79876939.0 --- # Dataset Card for "shilla-clothing-text-and-image-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/muramatsu_sakura_idolmastercinderellagirls
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of muramatsu_sakura/村松さくら/무라마츠사쿠라 (THE iDOLM@STER: Cinderella Girls) This is the dataset of muramatsu_sakura/村松さくら/무라마츠사쿠라 (THE iDOLM@STER: Cinderella Girls), containing 75 images and their tags. The core tags of this character are `brown_hair, twintails, hairband, short_twintails, bow, pink_eyes, short_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 75 | 55.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/muramatsu_sakura_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 75 | 41.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/muramatsu_sakura_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 155 | 80.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/muramatsu_sakura_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 75 | 53.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/muramatsu_sakura_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 155 | 103.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/muramatsu_sakura_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/muramatsu_sakura_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, smile, solo, open_mouth, blush, looking_at_viewer, necklace, one_eye_closed, skirt, hair_ornament | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | blush, white_shirt, 1girl, open_mouth, short_sleeves, :d, collared_shirt, low_twintails, school_uniform, simple_background, upper_body, white_background, bangs, bow_hairband, looking_at_viewer, red_bowtie, solo_focus, striped_bowtie, sweater_vest | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | open_mouth | blush | looking_at_viewer | necklace | one_eye_closed | skirt | hair_ornament | white_shirt | short_sleeves | :d | collared_shirt | low_twintails | school_uniform | simple_background | upper_body | white_background | bangs | bow_hairband | red_bowtie | solo_focus | striped_bowtie | sweater_vest | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:--------|:--------------------|:-----------|:-----------------|:--------|:----------------|:--------------|:----------------|:-----|:-----------------|:----------------|:-----------------|:--------------------|:-------------|:-------------------|:--------|:---------------|:-------------|:-------------|:-----------------|:---------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
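For the plain IMG+TXT packages listed above, a minimal sketch is given below. It assumes each archive unpacks to image files paired with same-named `.txt` tag files; that layout is not stated explicitly on this card, so treat it as an assumption:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch one of the IMG+TXT packages (here: the 800px variant listed above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/muramatsu_sakura_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumption: every image has a same-named .txt file holding its tags
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```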
Ti-Ma/wikipedia_2017
--- license: cc-by-sa-3.0 ---
anthonny/hate_speech
--- annotations_creators: - found language_creators: - crowdsourced language: - es-EC license: - unknown multilinguality: - monolingual pretty_name: hate speech size_categories: - unknown source_datasets: - original task_categories: - text-classification task_ids: - semantic-similarity-classification ---
nikchar/claim_verification_training_set_evidence
--- dataset_info: features: - name: id dtype: string - name: text dtype: string - name: lines dtype: string splits: - name: train num_bytes: 89974716 num_examples: 58850 download_size: 52193086 dataset_size: 89974716 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "20k_claims_evidence" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base
--- pretty_name: Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LoSboccacc/orthogonal-2x7B-v2-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-v2-base)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-19T00:22:47.991764](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base/blob/main/results_2024-01-19T00-22-47.991764.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6277587211591188,\n\ \ \"acc_stderr\": 0.032809227406765666,\n \"acc_norm\": 0.6311375541821904,\n\ \ \"acc_norm_stderr\": 0.03346155242000715,\n \"mc1\": 0.5042839657282742,\n\ \ \"mc1_stderr\": 0.01750285857737127,\n \"mc2\": 0.6680405227339233,\n\ \ \"mc2_stderr\": 0.01518478311873851\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n\ \ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817834\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n\ \ \"acc_stderr\": 0.004689685978155166,\n \"acc_norm\": 0.8569010157339175,\n\ \ \"acc_norm_stderr\": 0.0034945810763985373\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\ \ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\ \ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\ \ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\ \ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\ \ \"acc_norm_stderr\": 
0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\ \ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\ \ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\ \ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\ \ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\ \ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\ acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"\ acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\ acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 
0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \ \ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \ \ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \ \ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.818348623853211,\n \"acc_stderr\": 0.016530617409266847,\n \"\ acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266847\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\ acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\ acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389087,\n \ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389087\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\ \ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\ \ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\ \ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\ acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\ \ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\ \ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\ \ 
\"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\ \ \"acc_stderr\": 0.014214138556913915,\n \"acc_norm\": 0.8033205619412516,\n\ \ \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\ \ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\ \ \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n\ \ \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046623,\n\ \ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046623\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\ \ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\ \ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\ \ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\ : {\n \"acc\": 0.4471968709256845,\n \"acc_stderr\": 0.012698825252435108,\n\ \ \"acc_norm\": 0.4471968709256845,\n \"acc_norm_stderr\": 0.012698825252435108\n\ \ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\ : 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"\ acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\ \ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\ \ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\ \ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\ \ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\ \ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\ \ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\ \ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\ \ },\n 
\"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n\ \ \"mc1_stderr\": 0.01750285857737127,\n \"mc2\": 0.6680405227339233,\n\ \ \"mc2_stderr\": 0.01518478311873851\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698336\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5140257771038665,\n \ \ \"acc_stderr\": 0.013767064940239285\n }\n}\n```" repo_url: https://huggingface.co/LoSboccacc/orthogonal-2x7B-v2-base leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|arc:challenge|25_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-19T00-22-47.991764.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|gsm8k|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hellaswag|10_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet' - 
'**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - 
'**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - 
split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|truthfulqa:mc|0_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-19T00-22-47.991764.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_19T00_22_47.991764 path: - '**/details_harness|winogrande|5_2024-01-19T00-22-47.991764.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-19T00-22-47.991764.parquet' - config_name: results data_files: - split: 2024_01_19T00_22_47.991764 path: - results_2024-01-19T00-22-47.991764.parquet - split: latest path: - results_2024-01-19T00-22-47.991764.parquet --- # Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [LoSboccacc/orthogonal-2x7B-v2-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-v2-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. 
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T00:22:47.991764](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base/blob/main/results_2024-01-19T00-22-47.991764.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6277587211591188, "acc_stderr": 0.032809227406765666, "acc_norm": 0.6311375541821904, "acc_norm_stderr": 0.03346155242000715, "mc1": 0.5042839657282742, "mc1_stderr": 0.01750285857737127, "mc2": 0.6680405227339233, "mc2_stderr": 0.01518478311873851 }, "harness|arc:challenge|25": { "acc": 0.6237201365187713, "acc_stderr": 0.014157022555407163, "acc_norm": 0.6689419795221843, "acc_norm_stderr": 0.013752062419817834 }, "harness|hellaswag|10": { "acc": 0.6707827126070504, "acc_stderr": 0.004689685978155166, "acc_norm": 0.8569010157339175, "acc_norm_stderr": 0.0034945810763985373 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03782728980865469, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411018, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411018 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, 
"acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.046306532033665956, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.046306532033665956 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.02672949906834996, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.02672949906834996 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5974358974358974, "acc_stderr": 0.02486499515976775, "acc_norm": 0.5974358974358974, "acc_norm_stderr": 0.02486499515976775 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.818348623853211, "acc_stderr": 0.016530617409266847, "acc_norm": 0.818348623853211, "acc_norm_stderr": 0.016530617409266847 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977748, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977748 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967408, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967408 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 
0.026361651668389087, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389087 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.035208939510976534, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.035208939510976534 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.034878251684978906, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.034878251684978906 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8033205619412516, "acc_stderr": 0.014214138556913915, "acc_norm": 0.8033205619412516, "acc_norm_stderr": 0.014214138556913915 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.708092485549133, "acc_stderr": 0.024476994076247333, "acc_norm": 0.708092485549133, "acc_norm_stderr": 0.024476994076247333 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43687150837988825, "acc_stderr": 0.01658868086453063, "acc_norm": 0.43687150837988825, "acc_norm_stderr": 0.01658868086453063 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.696078431372549, "acc_stderr": 0.026336613469046623, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.026336613469046623 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539966, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4471968709256845, "acc_stderr": 0.012698825252435108, "acc_norm": 0.4471968709256845, "acc_norm_stderr": 0.012698825252435108 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6544117647058824, "acc_stderr": 0.028888193103988633, "acc_norm": 0.6544117647058824, "acc_norm_stderr": 0.028888193103988633 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0190709855896875, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0190709855896875 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6766169154228856, "acc_stderr": 0.03307615947979033, "acc_norm": 0.6766169154228856, "acc_norm_stderr": 0.03307615947979033 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.5042839657282742, "mc1_stderr": 0.01750285857737127, "mc2": 0.6680405227339233, "mc2_stderr": 0.01518478311873851 }, "harness|winogrande|5": { "acc": 0.7734806629834254, "acc_stderr": 0.011764149054698336 }, "harness|gsm8k|5": { "acc": 0.5140257771038665, "acc_stderr": 0.013767064940239285 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. 
--> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
joey234/mmlu-medical_genetics-original-neg
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D splits: - name: test num_bytes: 2502.24 num_examples: 12 download_size: 4137 dataset_size: 2502.24 --- # Dataset Card for "mmlu-medical_genetics-original-neg" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
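The card only declares the schema (a `question` string, a `choices` sequence, and an `answer` class label over A–D) plus a 12-example `test` split. Below is a minimal sketch of loading that split and assembling a multiple-choice prompt from those fields; the prompt template itself is an illustrative assumption, not something the card prescribes.

```python
from datasets import load_dataset

# Load the 12-example test split declared in the card's metadata.
ds = load_dataset("joey234/mmlu-medical_genetics-original-neg", split="test")

letters = ["A", "B", "C", "D"]
example = ds[0]

# Build a simple multiple-choice prompt; the exact template here is an
# illustrative assumption, not part of the dataset card.
prompt = example["question"] + "\n" + "\n".join(
    f"{letter}. {choice}" for letter, choice in zip(letters, example["choices"])
)
print(prompt)
print("Gold answer:", letters[example["answer"]])  # answer is a class-label index
```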
KAUE2006/FofaoOrivalPessiniVozAntiga
--- license: openrail ---
fernandoperes/py_legislation
--- language: - es license: apache-2.0 size_categories: - 1K<n<10K task_categories: - text-classification tags: - legal configs: - config_name: default data_files: - split: train path: "/raw_text/train.parquet" - config_name: raw_text data_files: - split: train path: "/raw_text/train.parquet" - config_name: unlabeled_sentences data_files: - split: train path: "/unlabeled_sentences/train.parquet" dataset_info: - config_name: raw_text features: - name: source_id dtype: int64 - name: source_name dtype: string - name: text dtype: string - name: text_id dtype: int64 - name: extension dtype: class_label: names: '0': docx '1': pdf '2': html '3': txt '4': doc split: train - config_name: unlabeled_sentences features: - name: source_id dtype: int64 - name: source_name dtype: string - name: text dtype: string - name: text_id dtype: int64 - name: cost_type dtype: class_label: names: '0': no_cost '1': adm_cost '2': direct_cost '3': other_cost - name: affected_entity dtype: class_label: names: '0': no_affected_ent '1': companies '2': citizens '3': public_adm - name: io_categories sequence: class_label: names: '0': prestacao_info_empresarial_e_fiscal '1': pedidos_de_licencas_e_outros '2': registos_e_notificacoes '3': candidatura_a_subsidios_e_outros '4': disponibilizacao_de_manuais_e_outros '5': cooperacao_com_auditorias_e_outros '6': prestacao_info_a_consumidores '7': outras_ois - name: aa_categories sequence: class_label: names: '0': aa_1_familiarizacao_com_oi '1': aa_1_recolha_e_organizacao_de_info '2': aa_1_processamento_de_info '3': aa_1_tempos_de_espera '4': aa_1_deslocacoes '5': aa_1_submissao_de_info '6': aa_1_preservacao_de_info '7': aa_2_familiarizacao_com_oi '8': aa_2_recolha_e_organizacao_de_info '9': aa_2_processamento_de_info '10': aa_2_tempos_de_espera '11': aa_2_deslocacoes '12': aa_2_submissao_de_info '13': aa_2_preservacao_de_info '14': aa_3_familiarizacao_com_oi '15': aa_3_recolha_e_organizacao_de_info '16': aa_3_processamento_de_info '17': aa_3_tempos_de_espera '18': aa_3_deslocacoes '19': aa_3_submissao_de_info '20': aa_3_preservacao_de_info '21': aa_4_familiarizacao_com_oi '22': aa_4_recolha_e_organizacao_de_info '23': aa_4_processamento_de_info '24': aa_4_tempos_de_espera '25': aa_4_deslocacoes '26': aa_4_submissao_de_info '27': aa_4_preservacao_de_info '28': aa_5_familiarizacao_com_oi '29': aa_5_recolha_e_organizacao_de_info '30': aa_5_processamento_de_info '31': aa_5_tempos_de_espera '32': aa_5_deslocacoes '33': aa_5_submissao_de_info '34': aa_5_preservacao_de_info '35': aa_6_familiarizacao_com_oi '36': aa_6_recolha_e_organizacao_de_info '37': aa_6_processamento_de_info '38': aa_6_tempos_de_espera '39': aa_6_deslocacoes '40': aa_6_submissao_de_info '41': aa_6_preservacao_de_info '42': aa_7_familiarizacao_com_oi '43': aa_7_recolha_e_organizacao_de_info '44': aa_7_processamento_de_info '45': aa_7_tempos_de_espera '46': aa_7_deslocacoes '47': aa_7_submissao_de_info '48': aa_7_preservacao_de_info - name: aa_categories_unique sequence: class_label: names: '0': familiarizacao_com_oi '1': recolha_e_organizacao_de_info '2': processamento_de_info '3': tempos_de_espera '4': deslocacoes '5': submissao_de_info '6': preservacao_de_info splits: - name: train --- # Paraguay Legislation The Paraguay Legislation dataset is a comprehensive collection of legal documents sourced from the legislative framework of Paraguay. The dataset contains legal documents sourced from the legislative framework of Paraguay, including resolutions, decrees, laws, and other kinds of legislative texts. 
This dataset has been curated as a resource for Natural Language Processing (NLP) research, with a focus on text classification. The classification task is divided into two objectives:

1. Binary classification: 0 - no cost and 1 - cost (the legislation imposes costs on society)
2. Multi-classification: classify the document into several hierarchical categories of costs. For more information about the multi-classification definitions, please check this link: <todo: link to>.

## Subsets

The dataset contains several subsets, each representing a different stage of data quality and preparation. Across these subsets you will find multiple versions of the same data, varying mainly in data quality, metadata columns, and the preprocessing applied. The subsets are the following:

**1. Raw:** Data extracted from the source files (URLs, PDFs, and Word files) without any transformation or sentence splitting. This subset is useful because you can start from the already-extracted raw text and apply alternative preprocessing without going back to the source files.

**2. Sentences:** Normalized data split by sentence, mainly addressing issues in text extracted from PDFs. This stage also adds metadata about each sentence, for example whether or not it is a title.

**3. Sentence Unlabeled:** An unlabeled corpus of Paraguay legislation, prepared to be labeled by the experts. Each instance represents a specific text passage, split according to the original formatting of the raw text (from the original documents).

**4. Sentence Labeled (Ground Truth):** The labeled data is the ground truth used to train the models. It is annotated by legal experts to indicate the presence of administrative costs (and other cost types) in the legislation. Each instance represents a specific text passage. The labeled data has the following splits:

* Training Set: This portion of the data is used to train and fine-tune machine learning models.
* Test Set: The test set is reserved for assessing the model's accuracy, generalization, and effectiveness. It remains unseen during training and helps gauge how well the model performs on new, unseen data.

Together, the labeled subsets provide a reference point for building and evaluating models, helping ensure they make accurate and reliable predictions and classifications.
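The card's metadata names two loadable configurations, `raw_text` and `unlabeled_sentences`, each with a single `train` split. A minimal sketch of loading them with the `datasets` library, assuming the Hub repository resolves the declared data files:

```python
from datasets import load_dataset

# Load both configurations declared in the card's metadata; each exposes a "train" split.
raw = load_dataset("fernandoperes/py_legislation", "raw_text", split="train")
sentences = load_dataset("fernandoperes/py_legislation", "unlabeled_sentences", split="train")

# Inspect one sentence-level record; column names follow the declared features
# (source_id, source_name, text, text_id, cost_type, ...).
row = sentences[0]
print(row["source_name"])
print(row["text"][:200])
```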
theosun/chinese_articles
--- task_categories: - text-generation language: - zh pretty_name: Chinese Articles ---
lshowway/Wikipedia_5gram_more_orders
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 3729298637 num_examples: 1894957 download_size: 2399612708 dataset_size: 3729298637 --- # Dataset Card for "Wikipedia_5gram_more_orders" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
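The card declares a single `text` feature and a `train` split of roughly 1.9M rows (~2.4 GB download), so streaming is a reasonable way to inspect it without pulling everything first. A minimal sketch, assuming the Hub repository matches the declared schema:

```python
from itertools import islice

from datasets import load_dataset

# Stream the train split instead of downloading ~2.4 GB of data up front.
ds = load_dataset("lshowway/Wikipedia_5gram_more_orders", split="train", streaming=True)

# Each record carries a single "text" field per the card's schema; peek at a few.
for record in islice(ds, 3):
    print(record["text"][:120])
```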