Columns: `datasetId` (string, 2–117 characters) and `card` (string, 19–1.01M characters).
frankier/processed_multiscale_rt_critics
---
dataset_info:
  features:
  - name: movie_title
    dtype: string
  - name: publisher_name
    dtype: string
  - name: critic_name
    dtype: string
  - name: review_content
    dtype: string
  - name: review_score
    dtype: string
  - name: grade_type
    dtype: string
  - name: orig_num
    dtype: float32
  - name: orig_denom
    dtype: float32
  - name: includes_zero
    dtype: bool
  - name: label
    dtype: uint8
  - name: scale_points
    dtype: uint8
  - name: multiplier
    dtype: uint8
  - name: group_id
    dtype: uint32
  splits:
  - name: train
    num_bytes: 117244343
    num_examples: 540256
  - name: test
    num_bytes: 28517095
    num_examples: 131563
  download_size: 0
  dataset_size: 145761438
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---

# Dataset Card for "processed_multiscale_rt_critics"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
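The `orig_num`/`orig_denom` pair in this schema suggests each review score is stored as a fraction of a per-publication scale. A minimal sketch of normalizing such a score to the unit interval; the helper name is hypothetical and the exact mapping to the `label` column is not specified by the card, so it is not attempted here:

```python
def normalized_score(orig_num: float, orig_denom: float) -> float:
    """Map a raw review score (e.g. 3.5 out of 5) to the unit interval [0, 1]."""
    if orig_denom <= 0:
        raise ValueError("denominator must be positive")
    return orig_num / orig_denom

# e.g. a 3.5/5 review
print(normalized_score(3.5, 5.0))  # → 0.7
```

How a normalized value is then bucketed onto `scale_points` discrete labels would depend on `includes_zero` and `multiplier`, which the card does not document.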
Kuartyh/Sasa
---
license: openrail
---
Mutonix/Vript-RR
---
task_categories:
- video-classification
- visual-question-answering
language:
- en
size_categories:
- n<1K
---

# 🎬 Vript: Refine Video Captioning into Video Scripting

---

# Vript-RR (Retrieve then Reason)

A video reasoning benchmark that first gives a detailed description of the scene as a hint and then asks questions about details in the scene.

<p align="center">
<img src="assets/Vript-RR_01.png" width="800">
</p>
<p align="center">
<img src="assets/Vript-RR_00.png" width="800">
</p>

## Getting Started

**By downloading these datasets, you agree to the terms of the [License](#License).**

```
Vript-RR/
├── RR_videos.zip
│   ├── -_MRAAhEKio.mp4
│   └── ...
│
├── RR_scenes.zip
│   ├── -_MRAAhEKio-Scene-010.mp4
│   └── ...
│
├── RR_annotations
│   ├── -_MRAAhEKio-Scene-010_RR.json
│   └── ...
│
└── RR_annotations.jsonl
```

- `RR_videos.zip`: The untrimmed videos in the Vript-RR benchmark.
- `RR_scenes.zip`: The trimmed video clips in the Vript-RR benchmark, which correspond to scenes in the `RR_annotations`.
- `RR_annotations`: The annotations of the Vript-RR benchmark. The `RR_annotations.jsonl` file contains all the annotations in the Vript-RR benchmark, which can be previewed in [Vript-RR](https://huggingface.co/datasets/Mutonix/Vript-RR) on Huggingface.

## License

By downloading or using the data or model, you understand, acknowledge, and agree to all the terms in the following agreement.

- **ACADEMIC USE ONLY.** Any content from the Vript/Vript-Bench dataset and the Vriptor model is available for academic research purposes only. You agree not to reproduce, duplicate, copy, trade, or exploit it for any commercial purposes.
- **NO DISTRIBUTION.** Respect the privacy of personal information in the original source. Without the permission of the copyright owner, you are not allowed to broadcast, modify, or otherwise redistribute the dataset content.
- **RESTRICTION AND LIMITATION OF LIABILITY.** In no event shall we be liable for any damages whatsoever arising out of the use of, or inability to use, this dataset and its associated software, even if we have been advised of the possibility of such damages.
- **DISCLAIMER.** You are solely responsible for legal liability arising from your improper use of the dataset content. We reserve the right to terminate your access to the dataset at any time. You should delete the Vript/Vript-Bench dataset or Vriptor model if required.

This license is modified from the [HD-VG-100M](https://github.com/daooshee/HD-VG-130M) license.

<!-- ## Citation
```
```
-->

## Contact

**Dongjie Yang**: [djyang.tony@sjtu.edu.cn](djyang.tony@sjtu.edu.cn)
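The `RR_annotations.jsonl` file described in the layout above is in JSON-Lines format, i.e. one JSON object per line. A minimal, generic reading sketch; the field names inside each annotation record are not documented here, so none are assumed:

```python
import json

def read_jsonl(path):
    """Yield one JSON record per non-empty line of a JSON-Lines file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # tolerate blank lines
                yield json.loads(line)

# Hypothetical usage, assuming the archive layout shown above:
# for record in read_jsonl("Vript-RR/RR_annotations.jsonl"):
#     ...
```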
scene-genie/ARCHIVED-instagram-dataset-train
---
dataset_info:
  features:
  - name: caption
    dtype: string
  - name: touched_image
    dtype: image
  - name: untouched_image
    dtype: image
  - name: resized_touched_image
    dtype: image
  - name: resized_untouched_image
    dtype: image
  - name: blank_prompt
    dtype: string
  - name: prompt
    dtype: string
  splits:
  - name: train
    num_bytes: 7616632048.0
    num_examples: 8660
  download_size: 7588171166
  dataset_size: 7616632048.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
thangvip/cosmopedia_vi_math
---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: text_token_length
    dtype: int64
  - name: text
    dtype: string
  - name: seed_data
    dtype: string
  - name: format
    dtype: string
  - name: audience
    dtype: string
  - name: vi_text
    dtype: string
  splits:
  - name: 0_set
    num_bytes: 7947999
    num_examples: 1000
  - name: 1_set
    num_bytes: 7922368
    num_examples: 1000
  - name: 2_set
    num_bytes: 7922478
    num_examples: 1000
  - name: 3_set
    num_bytes: 7940343
    num_examples: 1000
  - name: 4_set
    num_bytes: 7972359
    num_examples: 1000
  - name: 5_set
    num_bytes: 7857383
    num_examples: 1000
  - name: 6_set
    num_bytes: 7850101
    num_examples: 1000
  - name: 7_set
    num_bytes: 7883159
    num_examples: 1000
  - name: 8_set
    num_bytes: 7940395
    num_examples: 1000
  - name: 9_set
    num_bytes: 7906911
    num_examples: 1000
  - name: 10_set
    num_bytes: 7930300
    num_examples: 1000
  - name: 11_set
    num_bytes: 7873006
    num_examples: 1000
  - name: 12_set
    num_bytes: 7879820
    num_examples: 1000
  - name: 13_set
    num_bytes: 7918895
    num_examples: 1000
  - name: 14_set
    num_bytes: 7955010
    num_examples: 1000
  - name: 15_set
    num_bytes: 7816090
    num_examples: 1000
  - name: 16_set
    num_bytes: 5848354
    num_examples: 1000
  - name: 17_set
    num_bytes: 7882760
    num_examples: 1000
  - name: 18_set
    num_bytes: 7964555
    num_examples: 1000
  download_size: 73448237
  dataset_size: 148212286
configs:
- config_name: default
  data_files:
  - split: 0_set
    path: data/0_set-*
  - split: 1_set
    path: data/1_set-*
  - split: 2_set
    path: data/2_set-*
  - split: 3_set
    path: data/3_set-*
  - split: 4_set
    path: data/4_set-*
  - split: 5_set
    path: data/5_set-*
  - split: 6_set
    path: data/6_set-*
  - split: 7_set
    path: data/7_set-*
  - split: 8_set
    path: data/8_set-*
  - split: 9_set
    path: data/9_set-*
  - split: 10_set
    path: data/10_set-*
  - split: 11_set
    path: data/11_set-*
  - split: 12_set
    path: data/12_set-*
  - split: 13_set
    path: data/13_set-*
  - split: 14_set
    path: data/14_set-*
  - split: 15_set
    path: data/15_set-*
  - split: 16_set
    path: data/16_set-*
  - split: 17_set
    path: data/17_set-*
  - split: 18_set
    path: data/18_set-*
---
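Since the splits above are simply numbered `0_set` through `18_set`, their names can be generated rather than typed out. A small offline sketch (passing these names to a loader such as `datasets.load_dataset` is left out to keep this network-free):

```python
# The 19 splits of this dataset are numbered names: "0_set" ... "18_set".
splits = [f"{i}_set" for i in range(19)]

# Each split carries 1000 examples, so the whole dataset is 19 * 1000 rows.
total_examples = len(splits) * 1000
print(splits[0], splits[-1], total_examples)  # 0_set 18_set 19000
```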
chenbowen-184/events-marketing-sample
---
dataset_info:
  features:
  - name: event
    dtype: string
  - name: description
    dtype: string
  - name: marketing_email
    dtype: string
  splits:
  - name: train
    num_bytes: 3527
    num_examples: 3
  download_size: 11212
  dataset_size: 3527
---

# Dataset Card for "events-marketing-sample"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
TempoFunk/webvid-10M
---
license: agpl-3.0
task_categories:
- text-to-video
- text-to-image
- video-classification
- image-classification
language:
- en
size_categories:
- 1M<n<10M
---
fathyshalab/massive_datetime
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: label
    dtype: int64
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 21237
    num_examples: 402
  - name: validation
    num_bytes: 3849
    num_examples: 73
  - name: test
    num_bytes: 5464
    num_examples: 103
  download_size: 18224
  dataset_size: 30550
---

# Dataset Card for "massive_datetime"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
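The three splits above work out to roughly a 70/13/18 partition. A quick offline sanity check, using only the example counts copied from the card (this check is illustrative, not part of the card):

```python
# Split sizes as listed in the card above.
split_sizes = {"train": 402, "validation": 73, "test": 103}

total = sum(split_sizes.values())
fractions = {name: round(n / total, 3) for name, n in split_sizes.items()}
print(total, fractions)  # 578 {'train': 0.696, 'validation': 0.126, 'test': 0.178}
```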
open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3
--- pretty_name: Evaluation run of wei123602/llama-13b-FINETUNE3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\ \ be found as a specific split in each configuration, the split being named using\ \ the timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T20:56:48.132337](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-10-25T20-56-48.132337.json)\ \ (note that there might be results for other tasks in the repos if successive evals\ \ didn't cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10360738255033557,\n\ \ \"em_stderr\": 0.003120930790921416,\n \"f1\": 0.14798552852348912,\n\ \ \"f1_stderr\": 0.003214007613815376,\n \"acc\": 0.4442352766589695,\n\ \ \"acc_stderr\": 0.010435544785566055\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.10360738255033557,\n \"em_stderr\": 0.003120930790921416,\n\ \ \"f1\": 0.14798552852348912,\n \"f1_stderr\": 0.003214007613815376\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \ \ \"acc_stderr\": 0.00899288849727557\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.01187820107385654\n\ \ }\n}\n```" repo_url: https://huggingface.co/wei123602/llama-13b-FINETUNE3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|arc:challenge|25_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-13T02-24-38.254919.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T20_56_48.132337 path: - '**/details_harness|drop|3_2023-10-25T20-56-48.132337.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T20-56-48.132337.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T20_56_48.132337 path: - '**/details_harness|gsm8k|5_2023-10-25T20-56-48.132337.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T20-56-48.132337.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hellaswag|10_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_13T02_24_38.254919 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_13T02_24_38.254919 path: - '**/details_harness|truthfulqa:mc|0_2023-09-13T02-24-38.254919.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-13T02-24-38.254919.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T20_56_48.132337 path: - '**/details_harness|winogrande|5_2023-10-25T20-56-48.132337.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T20-56-48.132337.parquet' - config_name: results data_files: - split: 2023_09_13T02_24_38.254919 path: - results_2023-09-13T02-24-38.254919.parquet - split: 2023_10_25T20_56_48.132337 path: - results_2023-10-25T20-56-48.132337.parquet - split: latest path: - results_2023-10-25T20-56-48.132337.parquet --- # Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/wei123602/llama-13b-FINETUNE3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T20:56:48.132337](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-10-25T20-56-48.132337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.10360738255033557, "em_stderr": 0.003120930790921416, "f1": 0.14798552852348912, "f1_stderr": 0.003214007613815376, "acc": 0.4442352766589695, "acc_stderr": 0.010435544785566055 }, "harness|drop|3": { "em": 0.10360738255033557, "em_stderr": 0.003120930790921416, "f1": 0.14798552852348912, "f1_stderr": 0.003214007613815376 }, "harness|gsm8k|5": { "acc": 0.12130401819560273, "acc_stderr": 0.00899288849727557 }, "harness|winogrande|5": { "acc": 0.7671665351223362, "acc_stderr": 0.01187820107385654 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
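The aggregated metrics in the "Latest results" section of this card are plain nested JSON; a minimal sketch of flattening them into `(task, metric, value)` rows, using a subset of the values shown above (the nesting — task name mapping to metric/value pairs — follows the layout printed in the card):

```python
# Aggregated results copied from the "Latest results" block of this card.
results = {
    "all": {"em": 0.10360738255033557, "f1": 0.14798552852348912, "acc": 0.4442352766589695},
    "harness|gsm8k|5": {"acc": 0.12130401819560273, "acc_stderr": 0.00899288849727557},
    "harness|winogrande|5": {"acc": 0.7671665351223362, "acc_stderr": 0.01187820107385654},
}

def flatten(results):
    """Turn the nested results dict into (task, metric, value) rows."""
    return [
        (task, metric, value)
        for task, metrics in results.items()
        for metric, value in sorted(metrics.items())
    ]

rows = flatten(results)
# Look up one task's accuracy from the flattened rows.
winogrande_acc = {m: v for t, m, v in rows if t == "harness|winogrande|5"}["acc"]
```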
aisuko/emnlp2016_2018
--- license: apache-2.0 language: - en --- For research use only. The papers were downloaded from https://sbert.net/datasets/emnlp2016-2018.json
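The linked file is a single JSON document; a minimal sketch of parsing it once downloaded. The list-of-records schema used below is an assumption for illustration only — inspect the real file to confirm its structure:

```python
import json

def load_papers(json_text):
    """Parse the downloaded emnlp2016-2018.json content into Python objects.

    Assumes the file is standard JSON; the record fields shown in the
    sample below are hypothetical.
    """
    return json.loads(json_text)

# Hypothetical stand-in for the downloaded file content.
sample = '[{"title": "An Example Paper", "year": 2017}]'
papers = load_papers(sample)
```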
samiesam/snd_eng
--- license: apache-2.0 task_categories: - translation language: - en - sd tags: - code pretty_name: eng_snd size_categories: - n<1K --- ## Dataset Description This dataset is for the snd_to_eng project. ### Languages The BCP-47 code for the dataset's language is unk. ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "feat_id": "Value(dtype='int64', id=None)", "feat_source_lang": "Value(dtype='string', id=None)", "feat_target_lang": "Value(dtype='string', id=None)", "source": "Value(dtype='string', id=None)", "target": "Value(dtype='string', id=None)" } ```
usvsnsp/duped-num-frequencies
--- dataset_info: features: - name: TokenID dtype: int64 - name: Frequency dtype: int64 splits: - name: memorized num_bytes: 960000 num_examples: 60000 - name: non_memorized num_bytes: 960000 num_examples: 60000 - name: total num_bytes: 960000 num_examples: 60000 download_size: 1965812 dataset_size: 2880000 --- # Dataset Card for "duped-num-frequencies" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
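Each split pairs a `TokenID` with its `Frequency`; a minimal sketch of comparing mean frequencies between the memorized and non-memorized splits. The sample rows below are hypothetical stand-ins in the card's schema (the real splits hold 60,000 rows each):

```python
from statistics import mean

# Hypothetical sample rows with the card's schema: {"TokenID": int, "Frequency": int}.
memorized = [{"TokenID": 11, "Frequency": 900}, {"TokenID": 42, "Frequency": 700}]
non_memorized = [{"TokenID": 7, "Frequency": 30}, {"TokenID": 99, "Frequency": 50}]

def mean_frequency(rows):
    """Average corpus frequency across a split's (TokenID, Frequency) rows."""
    return mean(row["Frequency"] for row in rows)

gap = mean_frequency(memorized) - mean_frequency(non_memorized)
```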
vietgpt/OSCAR-2201
--- dataset_info: features: - name: id dtype: string - name: text dtype: string - name: url dtype: string - name: date dtype: string - name: perplexity dtype: float64 splits: - name: train num_bytes: 15978372237.047762 num_examples: 1700386 download_size: 6412125570 dataset_size: 15978372237.047762 --- # Dataset Card for "OSCAR-2201" Number of tokens: 2,682,681,285
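Each record carries a `perplexity` score alongside the text; a minimal sketch of keeping only low-perplexity records, with hypothetical sample rows in the card's schema (the threshold value is an assumption for illustration):

```python
# Hypothetical rows following the card's schema: id, text, url, date, perplexity.
rows = [
    {"id": "a", "text": "clean prose", "perplexity": 120.5},
    {"id": "b", "text": "noisy boilerplate", "perplexity": 980.0},
]

def below_perplexity(rows, threshold):
    """Keep rows whose language-model perplexity is under the threshold."""
    return [row for row in rows if row["perplexity"] < threshold]

kept = below_perplexity(rows, 300.0)
```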
zhan1993/clusters_with_loss_scores
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: cluster_name dtype: string - name: expert_names sequence: string splits: - name: train num_bytes: 8962 num_examples: 10 download_size: 6121 dataset_size: 8962 --- # Dataset Card for "clusters_with_loss_scores" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
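Each row maps a `cluster_name` to a sequence of `expert_names`; a minimal sketch of turning the rows into a cluster-to-experts lookup, with hypothetical sample rows in the card's schema:

```python
# Hypothetical rows in the card's schema: cluster_name plus a list of expert names.
rows = [
    {"cluster_name": "c0", "expert_names": ["expert_a", "expert_b"]},
    {"cluster_name": "c1", "expert_names": ["expert_c"]},
]

# Build a lookup from cluster name to its member experts.
clusters = {row["cluster_name"]: row["expert_names"] for row in rows}
```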
mychen76/wildreceipts_ocr_test
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 53982770.0 num_examples: 452 download_size: 49734928 dataset_size: 53982770.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "wildreceipts_ocr_test" See the train dataset for full details: https://huggingface.co/datasets/mychen76/wildreceipts_ocr_train [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AdapterOcean/med_alpaca_standardized_cluster_70
--- dataset_info: features: - name: text dtype: string - name: conversation_id dtype: int64 - name: embedding sequence: float64 - name: cluster dtype: int64 splits: - name: train num_bytes: 80826316 num_examples: 8297 download_size: 23277762 dataset_size: 80826316 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "med_alpaca_standardized_cluster_70" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Sachin7/story2_dataset
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 149165.10227272726 num_examples: 61 - name: test num_bytes: 66023.89772727272 num_examples: 27 download_size: 147376 dataset_size: 215189.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
davanstrien/haiku-preferences-test
--- size_categories: 1K<n<10K tags: - rlfh - argilla - human-feedback --- # Dataset Card for haiku-preferences-test This dataset has been created with [Argilla](https://docs.argilla.io). As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets). ## Dataset Description - **Homepage:** https://argilla.io - **Repository:** https://github.com/argilla-io/argilla - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset contains: * A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla. * Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`. * The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla. 
### Load with Argilla To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code: ```python import argilla as rg ds = rg.FeedbackDataset.from_huggingface("davanstrien/haiku-preferences-test") ``` ### Load with `datasets` To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code: ```python from datasets import load_dataset ds = load_dataset("davanstrien/haiku-preferences-test") ``` ### Supported Tasks and Leaderboards This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure). There are no leaderboards associated with this dataset. ### Languages [More Information Needed] ## Dataset Structure ### Data in Argilla The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**. The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions. | Field Name | Title | Type | Required | Markdown | | ---------- | ----- | ---- | -------- | -------- | | text | Text | text | True | True | The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking. | Question Name | Title | Type | Required | Description | Values/Labels | | ------------- | ----- | ---- | -------- | ----------- | ------------- | | label | Do you like this haiku? 
| label_selection | True | N/A | ['Yes', 'No'] | The **suggestions** are human or machine generated recommendations for each question to assist the annotator during the annotation process, so those are always linked to the existing questions, and named appending "-suggestion" and "-suggestion-metadata" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata is appended with "-suggestion-metadata". The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`. | Metadata Name | Title | Type | Values | Visible for Annotators | | ------------- | ----- | ---- | ------ | ---------------------- | The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section. ### Data Instances An example of a dataset instance in Argilla looks as follows: ```json { "external_id": null, "fields": { "text": "Peaceful summit rests,\nSky\u0027s reflection in still lake,\nSilence whispers on." }, "metadata": { "generation_model": "mistralai/Mistral-7B-Instruct-v0.2", "prompt": "Can you compose a haiku about the serenity of mountain peaks?" 
}, "responses": [], "suggestions": [], "vectors": {} } ``` While the same record in HuggingFace `datasets` looks as follows: ```json { "external_id": null, "label": [], "label-suggestion": null, "label-suggestion-metadata": { "agent": null, "score": null, "type": null }, "metadata": "{\"prompt\": \"Can you compose a haiku about the serenity of mountain peaks?\", \"generation_model\": \"mistralai/Mistral-7B-Instruct-v0.2\"}", "text": "Peaceful summit rests,\nSky\u0027s reflection in still lake,\nSilence whispers on." } ``` ### Data Fields Among the dataset fields, we differentiate between the following: * **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions. * **text** is of type `text`. * **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`. * **label** is of type `label_selection` with the following allowed values ['Yes', 'No']. * **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable. * (optional) **label-suggestion** is of type `label_selection` with the following allowed values ['Yes', 'No']. Additionally, we also have two more fields that are optional and are the following: * **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. 
For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`. * **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file. ### Data Splits The dataset contains a single split, which is `train`. ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation guidelines Do you like this haiku? Yes or no? A vibes only assessment is fine! #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
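In the flat `datasets` view shown earlier in this card, the `metadata` field arrives as a JSON-encoded string rather than a dictionary; a minimal sketch of decoding it, using the metadata values from the example record above:

```python
import json

# The `metadata` value from this card's example record (a JSON-encoded string).
record = {
    "metadata": '{"prompt": "Can you compose a haiku about the serenity of mountain peaks?", '
                '"generation_model": "mistralai/Mistral-7B-Instruct-v0.2"}',
}

# Decode the string back into a dictionary for programmatic access.
meta = json.loads(record["metadata"])
model = meta["generation_model"]
```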
HarryAJMK418/EminemTTSDATA
--- license: openrail ---
open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1
--- pretty_name: Evaluation run of abacusai/Smaug-70B-v0.1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [abacusai/Smaug-70B-v0.1](https://huggingface.co/abacusai/Smaug-70B-v0.1) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-03T05:35:28.928800](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1/blob/main/results_2024-02-03T05-35-28.928800.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7716613011645818,\n\ \ \"acc_stderr\": 0.02801089457302993,\n \"acc_norm\": 0.7734062646949216,\n\ \ \"acc_norm_stderr\": 0.028568963791437117,\n \"mc1\": 0.6560587515299877,\n\ \ \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.7666613083747418,\n\ \ \"mc2_stderr\": 0.014124410528709273\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.735494880546075,\n \"acc_stderr\": 0.012889272949313371,\n\ \ \"acc_norm\": 0.7602389078498294,\n \"acc_norm_stderr\": 0.012476304127453944\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7199761003784106,\n\ \ \"acc_stderr\": 0.004480929450281562,\n \"acc_norm\": 0.8926508663612827,\n\ \ \"acc_norm_stderr\": 0.0030892396746331585\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\ \ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\ \ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n\ \ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\ \ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \ \ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.8452830188679246,\n \"acc_stderr\": 0.022257075558791282,\n\ \ \"acc_norm\": 0.8452830188679246,\n \"acc_norm_stderr\": 0.022257075558791282\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9305555555555556,\n\ \ \"acc_stderr\": 0.021257974822832048,\n \"acc_norm\": 0.9305555555555556,\n\ \ \"acc_norm_stderr\": 
0.021257974822832048\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\ : 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \ \ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\ \ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\ \ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n\ \ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n\ \ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838728,\n\ \ \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838728\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\ \ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n\ \ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n\ \ \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.6904761904761905,\n \"acc_stderr\": 0.023809523809523864,\n \"\ acc_norm\": 
0.6904761904761905,\n \"acc_norm_stderr\": 0.023809523809523864\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n\ \ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n\ \ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\ \ \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n\ \ \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n\ \ \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\"\ : 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n\ \ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"\ acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n\ \ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637282,\n\ \ \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637282\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \ \ \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\ \ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"\ acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.9357798165137615,\n \"acc_stderr\": 0.010510494713201403,\n \"\ acc_norm\": 0.9357798165137615,\n \"acc_norm_stderr\": 0.010510494713201403\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176853,\n \"\ acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176853\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\ acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \ \ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\ \ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\ \ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n\ \ \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\ acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\ \ \"acc_stderr\": 0.033432700628696195,\n \"acc_norm\": 0.8611111111111112,\n\ \ \"acc_norm_stderr\": 0.033432700628696195\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\ \ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\ \ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\ \ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n\ \ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\ \ \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n\ \ \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9169859514687101,\n\ \ \"acc_stderr\": 0.009866287394639536,\n \"acc_norm\": 0.9169859514687101,\n\ \ \"acc_norm_stderr\": 0.009866287394639536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n\ \ \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6960893854748603,\n\ \ \"acc_stderr\": 0.01538284558758452,\n 
\"acc_norm\": 0.6960893854748603,\n\ \ \"acc_norm_stderr\": 0.01538284558758452\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n\ \ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n\ \ \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n\ \ \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n\ \ \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \ \ \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6023468057366362,\n\ \ \"acc_stderr\": 0.012499840347460642,\n \"acc_norm\": 0.6023468057366362,\n\ \ \"acc_norm_stderr\": 0.012499840347460642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549473,\n\ \ \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549473\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757773,\n \ \ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757773\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\ \ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\ \ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007646,\n\ \ \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007646\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\ \ \"acc_stderr\": 0.021166216304659397,\n \"acc_norm\": 0.900497512437811,\n\ \ \"acc_norm_stderr\": 0.021166216304659397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \ \ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\ \ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\ \ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n\ \ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6560587515299877,\n\ \ \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.7666613083747418,\n\ \ \"mc2_stderr\": 0.014124410528709273\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627305\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7869598180439727,\n \ \ \"acc_stderr\": 0.01127844785690078\n }\n}\n```" repo_url: https://huggingface.co/abacusai/Smaug-70B-v0.1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|arc:challenge|25_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-03T05-35-28.928800.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|gsm8k|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_03T05_35_28.928800 path: - 
'**/details_harness|hellaswag|10_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|truthfulqa:mc|0_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-03T05-35-28.928800.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_03T05_35_28.928800 path: - '**/details_harness|winogrande|5_2024-02-03T05-35-28.928800.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-03T05-35-28.928800.parquet' - config_name: results data_files: - split: 
2024_02_03T05_35_28.928800 path: - results_2024-02-03T05-35-28.928800.parquet - split: latest path: - results_2024-02-03T05-35-28.928800.parquet --- # Dataset Card for Evaluation run of abacusai/Smaug-70B-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [abacusai/Smaug-70B-v0.1](https://huggingface.co/abacusai/Smaug-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-03T05:35:28.928800](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1/blob/main/results_2024-02-03T05-35-28.928800.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7716613011645818, "acc_stderr": 0.02801089457302993, "acc_norm": 0.7734062646949216, "acc_norm_stderr": 0.028568963791437117, "mc1": 0.6560587515299877, "mc1_stderr": 0.016629087514276785, "mc2": 0.7666613083747418, "mc2_stderr": 0.014124410528709273 }, "harness|arc:challenge|25": { "acc": 0.735494880546075, "acc_stderr": 0.012889272949313371, "acc_norm": 0.7602389078498294, "acc_norm_stderr": 0.012476304127453944 }, "harness|hellaswag|10": { "acc": 0.7199761003784106, "acc_stderr": 0.004480929450281562, "acc_norm": 0.8926508663612827, "acc_norm_stderr": 0.0030892396746331585 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7185185185185186, "acc_stderr": 0.038850042458002526, "acc_norm": 0.7185185185185186, "acc_norm_stderr": 0.038850042458002526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.881578947368421, "acc_stderr": 0.026293995855474928, "acc_norm": 0.881578947368421, "acc_norm_stderr": 0.026293995855474928 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.82, "acc_stderr": 0.038612291966536955, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8452830188679246, "acc_stderr": 0.022257075558791282, "acc_norm": 0.8452830188679246, "acc_norm_stderr": 0.022257075558791282 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9305555555555556, "acc_stderr": 0.021257974822832048, "acc_norm": 0.9305555555555556, "acc_norm_stderr": 0.021257974822832048 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145633, "acc_norm": 0.62, 
"acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7456647398843931, "acc_stderr": 0.0332055644308557, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5686274509803921, "acc_stderr": 0.04928099597287534, "acc_norm": 0.5686274509803921, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.03942772444036622, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036622 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7914893617021277, "acc_stderr": 0.026556982117838728, "acc_norm": 0.7914893617021277, "acc_norm_stderr": 0.026556982117838728 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6140350877192983, "acc_stderr": 0.04579639422070434, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7724137931034483, "acc_stderr": 0.03493950380131184, "acc_norm": 0.7724137931034483, "acc_norm_stderr": 0.03493950380131184 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6904761904761905, "acc_stderr": 0.023809523809523864, "acc_norm": 0.6904761904761905, "acc_norm_stderr": 0.023809523809523864 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5714285714285714, "acc_stderr": 0.04426266681379909, "acc_norm": 0.5714285714285714, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8838709677419355, "acc_stderr": 0.018225757949432306, "acc_norm": 0.8838709677419355, "acc_norm_stderr": 0.018225757949432306 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6600985221674877, "acc_stderr": 0.033327690684107895, "acc_norm": 0.6600985221674877, "acc_norm_stderr": 0.033327690684107895 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066584, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066584 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9393939393939394, "acc_stderr": 0.016999994927421592, "acc_norm": 0.9393939393939394, "acc_norm_stderr": 0.016999994927421592 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9844559585492227, "acc_stderr": 0.008927492715084315, "acc_norm": 0.9844559585492227, "acc_norm_stderr": 0.008927492715084315 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8076923076923077, "acc_stderr": 0.019982347208637282, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.019982347208637282 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4703703703703704, "acc_stderr": 0.030431963547936584, "acc_norm": 0.4703703703703704, "acc_norm_stderr": 0.030431963547936584 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8445378151260504, "acc_stderr": 0.023536818625398904, "acc_norm": 0.8445378151260504, "acc_norm_stderr": 0.023536818625398904 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5629139072847682, "acc_stderr": 0.040500357222306355, "acc_norm": 0.5629139072847682, "acc_norm_stderr": 0.040500357222306355 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9357798165137615, "acc_stderr": 0.010510494713201403, "acc_norm": 0.9357798165137615, "acc_norm_stderr": 0.010510494713201403 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6805555555555556, "acc_stderr": 
0.03179876342176853, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.03179876342176853 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9117647058823529, "acc_stderr": 0.019907399791316945, "acc_norm": 0.9117647058823529, "acc_norm_stderr": 0.019907399791316945 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9113924050632911, "acc_stderr": 0.018498315206865384, "acc_norm": 0.9113924050632911, "acc_norm_stderr": 0.018498315206865384 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.02693611191280227, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.02693611191280227 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8931297709923665, "acc_stderr": 0.027096548624883733, "acc_norm": 0.8931297709923665, "acc_norm_stderr": 0.027096548624883733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8925619834710744, "acc_stderr": 0.028268812192540616, "acc_norm": 0.8925619834710744, "acc_norm_stderr": 0.028268812192540616 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8611111111111112, "acc_stderr": 0.033432700628696195, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.033432700628696195 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8343558282208589, "acc_stderr": 0.029208296231259104, "acc_norm": 0.8343558282208589, "acc_norm_stderr": 0.029208296231259104 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6160714285714286, "acc_stderr": 0.04616143075028546, "acc_norm": 0.6160714285714286, "acc_norm_stderr": 0.04616143075028546 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.0349260647662379, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.0349260647662379 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.015537514263253874, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.015537514263253874 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.86, "acc_stderr": 
0.034873508801977725, "acc_norm": 0.86, "acc_norm_stderr": 0.034873508801977725 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9169859514687101, "acc_stderr": 0.009866287394639536, "acc_norm": 0.9169859514687101, "acc_norm_stderr": 0.009866287394639536 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8410404624277457, "acc_stderr": 0.019685307033571946, "acc_norm": 0.8410404624277457, "acc_norm_stderr": 0.019685307033571946 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6960893854748603, "acc_stderr": 0.01538284558758452, "acc_norm": 0.6960893854748603, "acc_norm_stderr": 0.01538284558758452 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8496732026143791, "acc_stderr": 0.02046417512433263, "acc_norm": 0.8496732026143791, "acc_norm_stderr": 0.02046417512433263 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.842443729903537, "acc_stderr": 0.020692237273583984, "acc_norm": 0.842443729903537, "acc_norm_stderr": 0.020692237273583984 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8641975308641975, "acc_stderr": 0.019061588181505405, "acc_norm": 0.8641975308641975, "acc_norm_stderr": 0.019061588181505405 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6560283687943262, "acc_stderr": 0.02833801742861133, "acc_norm": 0.6560283687943262, "acc_norm_stderr": 0.02833801742861133 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6023468057366362, "acc_stderr": 0.012499840347460642, "acc_norm": 0.6023468057366362, "acc_norm_stderr": 0.012499840347460642 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8345588235294118, "acc_stderr": 0.02257177102549473, "acc_norm": 0.8345588235294118, "acc_norm_stderr": 0.02257177102549473 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.815359477124183, "acc_stderr": 0.015697029240757773, "acc_norm": 0.815359477124183, "acc_norm_stderr": 0.015697029240757773 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7454545454545455, "acc_stderr": 
0.04172343038705383, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.04172343038705383 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8163265306122449, "acc_stderr": 0.024789071332007646, "acc_norm": 0.8163265306122449, "acc_norm_stderr": 0.024789071332007646 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.021166216304659397, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.021166216304659397 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.93, "acc_stderr": 0.0256432399976243, "acc_norm": 0.93, "acc_norm_stderr": 0.0256432399976243 }, "harness|hendrycksTest-virology|5": { "acc": 0.5783132530120482, "acc_stderr": 0.038444531817709175, "acc_norm": 0.5783132530120482, "acc_norm_stderr": 0.038444531817709175 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8713450292397661, "acc_stderr": 0.025679342723276894, "acc_norm": 0.8713450292397661, "acc_norm_stderr": 0.025679342723276894 }, "harness|truthfulqa:mc|0": { "mc1": 0.6560587515299877, "mc1_stderr": 0.016629087514276785, "mc2": 0.7666613083747418, "mc2_stderr": 0.014124410528709273 }, "harness|winogrande|5": { "acc": 0.850828729281768, "acc_stderr": 0.010012598805627305 }, "harness|gsm8k|5": { "acc": 0.7869598180439727, "acc_stderr": 0.01127844785690078 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
dmrau/cqudubstack-physics
--- configs: - config_name: default data_files: - split: queries path: data/queries-* - split: corpus path: data/corpus-* dataset_info: features: - name: _id dtype: string - name: text dtype: string - name: title dtype: string splits: - name: queries num_bytes: 73255 num_examples: 1039 - name: corpus num_bytes: 29949928 num_examples: 38316 download_size: 17827262 dataset_size: 30023183 --- # Dataset Card for "cqudubstack-physics" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered
--- pretty_name: Evaluation run of ausboss/llama7b-wizardlm-unfiltered dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [ausboss/llama7b-wizardlm-unfiltered](https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-14T20:36:45.563210](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered/blob/main/results_2023-10-14T20-36-45.563210.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\ \ \"em_stderr\": 0.000377860919646065,\n \"f1\": 0.06781774328859062,\n\ \ \"f1_stderr\": 0.0014130588941040406,\n \"acc\": 0.3830910982884477,\n\ \ \"acc_stderr\": 0.009089439265290138\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.000377860919646065,\n\ \ \"f1\": 0.06781774328859062,\n \"f1_stderr\": 0.0014130588941040406\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.043214556482183475,\n \ \ \"acc_stderr\": 0.0056009875152378584\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342417\n\ \ }\n}\n```" repo_url: https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|arc:challenge|25_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-25T14:23:27.383189.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_14T20_36_45.563210 path: - '**/details_harness|drop|3_2023-10-14T20-36-45.563210.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-14T20-36-45.563210.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_14T20_36_45.563210 path: - '**/details_harness|gsm8k|5_2023-10-14T20-36-45.563210.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-14T20-36-45.563210.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hellaswag|10_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_25T14_23_27.383189 path: - '**/details_harness|truthfulqa:mc|0_2023-08-25T14:23:27.383189.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-25T14:23:27.383189.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_14T20_36_45.563210 path: - '**/details_harness|winogrande|5_2023-10-14T20-36-45.563210.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-14T20-36-45.563210.parquet' - config_name: results data_files: - split: 2023_08_25T14_23_27.383189 path: - results_2023-08-25T14:23:27.383189.parquet - split: 2023_10_14T20_36_45.563210 path: - results_2023-10-14T20-36-45.563210.parquet - split: latest path: - results_2023-10-14T20-36-45.563210.parquet --- # Dataset Card for Evaluation run of ausboss/llama7b-wizardlm-unfiltered ## Dataset Description 
- **Homepage:**
- **Repository:** https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [ausboss/llama7b-wizardlm-unfiltered](https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered",
    "harness_winogrande_5",
    split="latest",
)
```

## Latest results

These are the [latest results from run 2023-10-14T20:36:45.563210](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered/blob/main/results_2023-10-14T20-36-45.563210.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find them in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.000377860919646065,
        "f1": 0.06781774328859062,
        "f1_stderr": 0.0014130588941040406,
        "acc": 0.3830910982884477,
        "acc_stderr": 0.009089439265290138
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.000377860919646065,
        "f1": 0.06781774328859062,
        "f1_stderr": 0.0014130588941040406
    },
    "harness|gsm8k|5": {
        "acc": 0.043214556482183475,
        "acc_stderr": 0.0056009875152378584
    },
    "harness|winogrande|5": {
        "acc": 0.7229676400947119,
        "acc_stderr": 0.012577891015342417
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
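As a sanity check, the aggregated `acc` reported under `"all"` in the latest results is the plain mean of the per-task accuracies; a minimal sketch using the values from the results JSON above:

```python
# Values copied from the "Latest results" JSON above.
per_task_acc = {
    "harness|gsm8k|5": 0.043214556482183475,
    "harness|winogrande|5": 0.7229676400947119,
}

# The "all" accuracy is the unweighted mean over the acc-reporting tasks.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ~0.3830910982884477, the "all" -> "acc" value above
```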
mauro-nievoff/MultiCaRe_Dataset
---
license: cc-by-4.0
task_categories:
- image-classification
- image-to-text
- text-to-image
language:
- en
tags:
- medical
- images
- computer vision
- multimodal
- text
- clinical
- nlp
pretty_name: MultiCaRe Dataset
---

The dataset contains multi-modal data from over 75,000 open access and de-identified case reports, including metadata, clinical cases, image captions and more than 130,000 images. Images and clinical cases belong to different medical specialties, such as oncology, cardiology, surgery and pathology. The structure of the dataset makes it easy to map images to their corresponding article metadata, clinical case, captions and image labels. Details of the data structure can be found in the file data_dictionary.csv.

Almost 100,000 patients and almost 400,000 medical doctors and researchers were involved in the creation of the articles included in this dataset. The citation data of each article can be found in the metadata.parquet file.

Refer to the examples showcased in [this GitHub repository](https://github.com/mauro-nievoff/MultiCaRe_Dataset) to understand how to optimize the use of this dataset. For detailed insight into the contents of this dataset, please refer to [this data article](https://www.sciencedirect.com/science/article/pii/S2352340923010351) published in Data In Brief. The dataset is also available on [Zenodo](https://zenodo.org/records/10079370).
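The image-to-article mapping described above amounts to a join on a shared article identifier; a minimal sketch (the field names here are illustrative assumptions, not the dataset's actual schema, which is documented in data_dictionary.csv):

```python
# Illustrative records only: field names and values are assumptions for the
# sake of the sketch; consult data_dictionary.csv for the real schema.
metadata = {
    "a1": {"title": "Case report 1", "specialty": "oncology"},
    "a2": {"title": "Case report 2", "specialty": "cardiology"},
}
images = [
    {"article_id": "a1", "caption": "Fig. 1: CT scan"},
    {"article_id": "a2", "caption": "Fig. 1: Echocardiogram"},
]

# Attach the article-level metadata to each image record via the shared id:
for img in images:
    img.update(metadata[img["article_id"]])
```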
DanielPFlorian/Hugging-Face-Datasets-Github-Issues
---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: Hugging Face Datasets Github Issues
size_categories:
- unknown
source_datasets:
- original
tags:
- github
- github-issues
- datasets
- huggingface
task_categories:
- text-classification
- text-retrieval
task_ids:
- multi-class-classification
- multi-label-classification
- document-retrieval
---
GEM/conversational_weather
--- annotations_creators: - none language_creators: - unknown language: - en license: - cc-by-nc-4.0 multilinguality: - unknown size_categories: - unknown source_datasets: - original task_categories: - table-to-text task_ids: [] pretty_name: conversational_weather tags: - data-to-text --- # Dataset Card for GEM/conversational_weather ## Dataset Description - **Homepage:** [Needs More Information] - **Repository:** https://github.com/facebookresearch/TreeNLG - **Paper:** https://aclanthology.org/P19-1080 - **Leaderboard:** N/A - **Point of Contact:** Kartikeya Upasani ### Link to Main Data Card You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/conversational_weather). ### Dataset Summary The purpose of this dataset is to assess how well a model can learn a template-like structure in a very low data setting. The task here is to produce a response to a weather-related query. The reply is further specified through the data attributes and discourse structure in the input. The output contains both the lexicalized text and discourse markers for attributes (e.g., `_ARG_TEMP_ 34`). You can load the dataset via: ``` import datasets data = datasets.load_dataset('GEM/conversational_weather') ``` The data loader can be found [here](https://huggingface.co/datasets/GEM/conversational_weather). #### paper [ACL Anthology](https://aclanthology.org/P19-1080) #### authors Anusha Balakrishnan, Jinfeng Rao, Kartikeya Upasani, Michael White, Rajen Subba (Facebook Conversational AI) ## Dataset Overview ### Where to find the Data and its Documentation #### Download <!-- info: What is the link to where the original dataset is hosted? --> <!-- scope: telescope --> [Github](https://github.com/facebookresearch/TreeNLG) #### Paper <!-- info: What is the link to the paper describing the dataset (open access preferred)? 
--> <!-- scope: telescope --> [ACL Anthology](https://aclanthology.org/P19-1080) #### BibTex <!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. --> <!-- scope: microscope --> ``` @inproceedings{balakrishnan-etal-2019-constrained, title = "Constrained Decoding for Neural {NLG} from Compositional Representations in Task-Oriented Dialogue", author = "Balakrishnan, Anusha and Rao, Jinfeng and Upasani, Kartikeya and White, Michael and Subba, Rajen", booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics", month = jul, year = "2019", address = "Florence, Italy", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/P19-1080", doi = "10.18653/v1/P19-1080", pages = "831--844" } ``` #### Contact Name <!-- quick --> <!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. --> <!-- scope: periscope --> Kartikeya Upasani #### Contact Email <!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. --> <!-- scope: periscope --> kart@fb.com #### Has a Leaderboard? <!-- info: Does the dataset have an active leaderboard? --> <!-- scope: telescope --> no ### Languages and Intended Use #### Multilingual? <!-- quick --> <!-- info: Is the dataset multilingual? --> <!-- scope: telescope --> no #### Covered Languages <!-- quick --> <!-- info: What languages/dialects are covered in the dataset? --> <!-- scope: telescope --> `English` #### License <!-- quick --> <!-- info: What is the license of the dataset? --> <!-- scope: telescope --> cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International #### Intended Use <!-- info: What is the intended use of the dataset? 
--> <!-- scope: microscope --> This dataset is intended to help develop conversational agents that exhibit human-like properties such as matching the framing of the response with the query or contrasting relevant data attributes. #### Primary Task <!-- info: What primary task does the dataset support? --> <!-- scope: telescope --> Data-to-Text #### Communicative Goal <!-- quick --> <!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. --> <!-- scope: periscope --> Producing a text that is a response to a weather query as per the discourse structure and data attributes specified in the input meaning representation. ### Credit #### Curation Organization Type(s) <!-- info: In what kind of organization did the dataset curation happen? --> <!-- scope: telescope --> `industry` #### Curation Organization(s) <!-- info: Name the organization(s). --> <!-- scope: periscope --> Facebook #### Dataset Creators <!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). --> <!-- scope: microscope --> Anusha Balakrishnan, Jinfeng Rao, Kartikeya Upasani, Michael White, Rajen Subba (Facebook Conversational AI) #### Funding <!-- info: Who funded the data creation? --> <!-- scope: microscope --> Facebook #### Who added the Dataset to GEM? <!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. --> <!-- scope: microscope --> Vipul Raheja (Grammarly) ### Dataset Structure #### Data Fields <!-- info: List and describe the fields present in the dataset. 
--> <!-- scope: telescope --> - `gem_id`: (string): GEM-formatted row id - `id`: (string): Row id in the original data - `user_query`: (string): Natural language weather query from humans - `tree_str_mr`: (string): Synthetically-added user context (datetime and location) in the form of a tree-structured MR - `response`: (string): A tree-structured annotation of the response. #### Example Instance <!-- info: Provide a JSON formatted example of a typical instance in the dataset. --> <!-- scope: periscope --> ``` {'gem_id': 'weather-train-11', 'id': '1108963', 'synthetic_user_context': '[__DG_INFORM__ [__ARG_TASK__ get_forecast ] ' '[__ARG_TEMP__ 37 ] [__ARG_TEMP_UNIT__ fahrenheit ] ' '[__ARG_CLOUD_COVERAGE__ partly cloudy ] ' '[__ARG_DATE_TIME__ [__ARG_COLLOQUIAL__ currently ] ' '] [__ARG_LOCATION__ [__ARG_CITY__ Oakland ] ' '[__ARG_COUNTRY__ United States ] [__ARG_REGION__ ' 'California ] ] ] [__DG_INFORM__ [__ARG_TASK__ ' 'get_forecast ] [__ARG_TEMP_SUMMARY__ mid 40s ] ' '[__ARG_DATE_TIME_RANGE__ [__ARG_COLLOQUIAL__ This ' 'afternoon ] ] [__ARG_LOCATION__ [__ARG_CITY__ ' 'Oakland ] [__ARG_COUNTRY__ United States ] ' '[__ARG_REGION__ California ] ] ] [__DG_INFORM__ ' '[__ARG_TASK__ get_forecast ] ' '[__ARG_CLOUD_COVERAGE__ mostly sunny ] ' '[__ARG_DATE_TIME_RANGE__ [__ARG_COLLOQUIAL__ This ' 'afternoon ] ] [__ARG_LOCATION__ [__ARG_CITY__ ' 'Oakland ] [__ARG_COUNTRY__ United States ] ' '[__ARG_REGION__ California ] ] ]', 'tree_str_mr': "[__DG_INFORM__ It's [__ARG_DATE_TIME__ [__ARG_COLLOQUIAL__ " 'currently ] ] [__ARG_CLOUD_COVERAGE__ partly cloudy ] and ' '[__ARG_TEMP__ __ARG_TEMP__ ] [__ARG_TEMP_UNIT__ ' '__ARG_TEMP_UNIT__ ] [__ARG_LOCATION__ in [__ARG_CITY__ ' '__ARG_CITY__ ] , [__ARG_REGION__ __ARG_REGION__ ] , ' '[__ARG_COUNTRY__ __ARG_COUNTRY__ ] ] . 
] [__DG_INFORM__ ' '[__ARG_DATE_TIME_RANGE__ [__ARG_COLLOQUIAL__ This afternoon ] ' "] , it'll be [__ARG_CLOUD_COVERAGE__ mostly sunny ] ] " '[__DG_INFORM__ with temperatures in the [__ARG_TEMP_SUMMARY__ ' 'mid <number> ] ]', 'user_query': 'Show weather forecast for Oakland, CA. '} ``` #### Data Splits <!-- info: Describe and name the splits in the dataset if there are more than one. --> <!-- scope: periscope --> - Standard Splits: Train/Validation/Test - Additional Split: Disc_Test (a more challenging subset of the test set that contains discourse relations) #### Splitting Criteria <!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. --> <!-- scope: microscope --> The test set contains 3,121 examples, of which 1.1K (35%) have unique MRs that have never been seen in the training set. #### <!-- info: What does an outlier of the dataset in terms of length/perplexity/embedding look like? --> <!-- scope: microscope --> ``` {'gem_id': 'weather-train-13333', 'data_id': '1260610', 'user_query': 'Sundown', 'tree_str_mr': '[__DG_INFORM__ [__ARG_TASK__ get_weather_attribute ] [__ARG_SUNSET_TIME_DATE_TIME__ [__ARG_TIME__ 05:04 PM ] ] ]', 'response': '[__DG_INFORM__ The sun will go down at [__ARG_SUNSET_TIME_DATE_TIME__ [__ARG_TIME__ __ARG_TIME__ ] ] ]'} ``` ## Dataset in GEM ### Rationale for Inclusion in GEM #### Why is the Dataset in GEM? <!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? --> <!-- scope: microscope --> The dataset was curated to develop a weather bot that exhibits human-like properties such as matching the framing of the response with the query or contrasting relevant data attributes. 
The dataset offers rich tree-based meaning representations that provide fine-grained control over the response, e.g. by specifying which two attributes are to be contrasted. The natural language input queries are also provided to model the coherence of the response based on the input. The output response is annotated with the input meaning components using special bracketing tokens, which enables developing new techniques such as constrained decoding to improve the quality of output responses.

#### Similar Datasets

<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
no

#### Ability that the Dataset measures

<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
Adequately expressing CONTRAST and JUSTIFY discourse relations with appropriate grouping of arguments; adequately generalizing to many combinations of arguments.

### GEM-Specific Curation

#### Modified for GEM?

<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
yes

#### GEM Modifications

<!-- info: What changes have been made to the original dataset? -->
<!-- scope: periscope -->
`data points removed`

#### Modification Details

<!-- info: For each of these changes, describe them in more detail and provide the intended purpose of the modification -->
<!-- scope: microscope -->
The original repo contained a challenge set disc_test.tsv, which is a subset of the test set consisting of discourse relations (CONTRAST and JUSTIFY), but it also contained JOIN relations. This discrepancy has been rectified in the GEM version. The rectified version has been added to the `challenge_sets`.

#### Additional Splits?

<!-- info: Does GEM provide additional splits to the dataset?
-->
<!-- scope: telescope -->
no

### Getting Started with the Task

## Previous Results

### Previous Results

#### Measured Model Abilities

<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
Adequately expressing CONTRAST and JUSTIFY discourse relations with appropriate grouping of arguments; adequately generalizing to many combinations of arguments.

#### Metrics

<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`BLEU`, `Other: Other Metrics`

#### Other Metrics

<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
Tree accuracy: It measures whether the tree structure in the prediction matches that of the input MR exactly (modulo repeated arguments that need only appear once).

#### Proposed Evaluation

<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
Automatic metrics are evaluated on the raw model predictions (which have de-lexicalized fields):

* Tree accuracy: Measures whether the tree structure in the prediction matches that of the input MR exactly.
* BLEU-4: A word overlap metric commonly used for evaluating NLG systems.

The authors also performed human evaluation studies by asking annotators to evaluate the quality of responses produced by different models. Annotators provided binary ratings on the following dimensions:

* Grammaticality: Measures fluency of the responses.
* Correctness: Measures semantic correctness of the responses.

#### Previous results available?

<!-- info: Are previous results available?
--> <!-- scope: telescope --> no ## Dataset Curation ### Original Curation #### Original Curation Rationale <!-- info: Original curation rationale --> <!-- scope: telescope --> The dataset was curated to develop a weather bot that exhibits human-like properties such as matching the framing of the response with the query or contrasting relevant data attributes. To achieve this, the dataset contains rich tree-structured meaning representations that are specified using several data arguments and discourse acts, the input natural language queries, and annotations for the responses. #### Communicative Goal <!-- info: What was the communicative goal? --> <!-- scope: periscope --> Producing a text that is a response to a weather query as per the discourse structure and data attributes specified in the input meaning representation. #### Sourced from Different Sources <!-- info: Is the dataset aggregated from different data sources? --> <!-- scope: telescope --> no ### Language Data #### How was Language Data Obtained? <!-- info: How was the language data obtained? --> <!-- scope: telescope --> `Crowdsourced`, `Machine-generated` #### Where was it crowdsourced? <!-- info: If crowdsourced, where from? --> <!-- scope: periscope --> `Other crowdworker platform` #### Topics Covered <!-- info: Does the language in the dataset focus on specific topics? How would you describe them? --> <!-- scope: periscope --> The dataset is focused on the weather domain: Weather was the first successful case of NLG put into production back in the 80s (Reiter & Dale, 1997). This domain offers significant complexity for NLG. Weather forecast summaries in particular can be very long, and require reasoning over several disjoint pieces of information. #### Data Validation <!-- info: Was the text validated by a different worker or a data curator? --> <!-- scope: telescope --> validated by crowdworker #### Data Preprocessing <!-- info: How was the text data pre-processed? 
(Enter N/A if the text was not pre-processed) -->
<!-- scope: microscope -->
Please refer to Appendix D of the original paper for details.

#### Was Data Filtered?

<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
hybrid

#### Filter Criteria

<!-- info: What were the selection criteria? -->
<!-- scope: microscope -->
Please refer to Appendix C of the original paper for details.

### Structured Annotations

#### Additional Annotations?

<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
none

#### Annotation Service?

<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no

### Consent

#### Any Consent Policy?

<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no

#### Justification for Using the Data

<!-- info: If not, what is the justification for reusing the data? -->
<!-- scope: microscope -->
Annotation was done as work for hire and contains no PII.

### Private Identifying Information (PII)

#### Contains PII?

<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII

#### Justification for no PII

<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
Data is simulated and not specific to annotator.

### Maintenance

#### Any Maintenance Plan?

<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no

## Broader Social Context

### Previous Work on the Social Impact of the Dataset

#### Usage of Models based on the Data

<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no

### Impact on Under-Served Communities

#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no

### Discussion of Biases

#### Any Documented Social Biases?

<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
unsure

#### Are the Language Producers Representative of the Language?

<!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? -->
<!-- scope: periscope -->
Grammatical evaluations performed with the data to date have used norms from informal Standard American English. These prescriptive notions of grammaticality potentially serve to perpetuate systemic power imbalances as they're conveyed by language. Since the data only contains informal Standard American English, its use to train a model may not be appropriate depending on the potential use case.

## Considerations for Using the Data

### PII Risks and Liability

#### Potential PII Risk

<!-- info: Considering your answers to the PII part of the Data Curation Section, describe any potential privacy risks to the data subjects and creators when using the dataset. -->
<!-- scope: microscope -->
Annotation was done as work for hire and contains no PII. Annotated data is simulated and not specific to annotator.
### Licenses

### Known Technical Limitations

#### Unsuited Applications

<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
An imperfect model used to convey actual weather data could mislead users about weather conditions.
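As a practical footnote on the annotation format used throughout this card: the space-separated, bracketed tree MRs (e.g. `[__DG_INFORM__ [__ARG_TASK__ get_forecast ] ... ]`) can be decoded with a small stack-based routine. The sketch below is illustrative only, not official project tooling; it assumes the tokenization conventions visible in the examples above (opening brackets fused to tag names, closing brackets standing alone).

```python
# Illustrative sketch (not official tooling): parse the space-separated,
# bracketed MR strings used in this card into a nested (tag, children) tree.
# Assumption: opens look like "[__TAG__" (bracket fused to the tag) and
# closes are a lone "]", matching the examples shown under Data Fields.

def parse_mr(mr_string):
    """Return a ("ROOT", children) tree; children are (tag, children)
    tuples for bracketed nodes and plain strings for text tokens."""
    root = ("ROOT", [])
    stack = [root]
    for token in mr_string.split():
        if token.startswith("[__"):
            node = (token[1:], [])       # strip the leading "["
            stack[-1][1].append(node)
            stack.append(node)
        elif token == "]":
            stack.pop()                  # close the current node
        else:
            stack[-1][1].append(token)   # text token inside the open node
    return root

tree = parse_mr("[__DG_INFORM__ [__ARG_TASK__ get_forecast ] [__ARG_TEMP__ 37 ] ]")
# tree[1][0] is ("__DG_INFORM__", [("__ARG_TASK__", ["get_forecast"]),
#                                  ("__ARG_TEMP__", ["37"])])
```

A parse like this is a natural building block for structural checks such as the tree accuracy metric described in Previous Results, which compares the predicted tree against the input MR.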
GEM/TaTA
---
annotations_creators:
- none
language_creators:
- unknown
language:
- ar
- en
- fr
- ha
- ig
- pt
- ru
- sw
- yo
multilinguality:
- yes
size_categories:
- unknown
source_datasets:
- original
task_categories:
- table-to-text
task_ids: []
pretty_name: TaTA
tags:
- data-to-text
license: cc-by-sa-4.0
dataset_info:
  features:
  - name: gem_id
    dtype: string
  - name: example_id
    dtype: string
  - name: title
    dtype: string
  - name: unit_of_measure
    dtype: string
  - name: chart_type
    dtype: string
  - name: was_translated
    dtype: string
  - name: table_data
    dtype: string
  - name: linearized_input
    dtype: string
  - name: table_text
    sequence: string
  - name: target
    dtype: string
  splits:
  - name: ru
    num_bytes: 308435
    num_examples: 210
  - name: test
    num_bytes: 1691383
    num_examples: 763
  - name: train
    num_bytes: 10019272
    num_examples: 6962
  - name: validation
    num_bytes: 1598442
    num_examples: 754
  download_size: 18543506
  dataset_size: 13617532
---

# Dataset Card for GEM/TaTA

## Dataset Description

- **Homepage:** https://github.com/google-research/url-nlp
- **Repository:** https://github.com/google-research/url-nlp
- **Paper:** https://arxiv.org/abs/2211.00142
- **Leaderboard:** https://github.com/google-research/url-nlp
- **Point of Contact:** Sebastian Ruder

### Link to Main Data Card

You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/TaTA).

### Dataset Summary

Existing data-to-text generation datasets are mostly limited to English. Table-to-Text in African languages (TaTA) addresses this lack of data as the first large multilingual table-to-text dataset with a focus on African languages. TaTA was created by transcribing figures and accompanying text in bilingual reports by the Demographic and Health Surveys Program, followed by professional translation to make the dataset fully parallel. TaTA includes 8,700 examples in nine languages including four African languages (Hausa, Igbo, Swahili, and Yorùbá) and a zero-shot test language (Russian).
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/TaTA')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/TaTA).

#### website
[Github](https://github.com/google-research/url-nlp)

#### paper
[ArXiv](https://arxiv.org/abs/2211.00142)

#### authors
Sebastian Gehrmann, Sebastian Ruder, Vitaly Nikolaev, Jan A. Botha, Michael Chavinda, Ankur Parikh, Clara Rivera

## Dataset Overview

### Where to find the Data and its Documentation

#### Webpage

<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Github](https://github.com/google-research/url-nlp)

#### Download

<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Github](https://github.com/google-research/url-nlp)

#### Paper

<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[ArXiv](https://arxiv.org/abs/2211.00142)

#### BibTex

<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@misc{gehrmann2022TaTA,
  Author = {Sebastian Gehrmann and Sebastian Ruder and Vitaly Nikolaev and Jan A. Botha and Michael Chavinda and Ankur Parikh and Clara Rivera},
  Title = {TaTa: A Multilingual Table-to-Text Dataset for African Languages},
  Year = {2022},
  Eprint = {arXiv:2211.00142},
}
```

#### Contact Name

<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Sebastian Ruder

#### Contact Email

<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
ruder@google.com

#### Has a Leaderboard?

<!-- info: Does the dataset have an active leaderboard?
-->
<!-- scope: telescope -->
yes

#### Leaderboard Link

<!-- info: Provide a link to the leaderboard. -->
<!-- scope: periscope -->
[Github](https://github.com/google-research/url-nlp)

#### Leaderboard Details

<!-- info: Briefly describe how the leaderboard evaluates models. -->
<!-- scope: microscope -->
The paper introduces a metric, StATA, which is trained on human ratings and which is used to rank approaches submitted to the leaderboard.

### Languages and Intended Use

#### Multilingual?

<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
yes

#### Covered Languages

<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`, `Portuguese`, `Arabic`, `French`, `Hausa`, `Swahili (macrolanguage)`, `Igbo`, `Yoruba`, `Russian`

#### Whose Language?

<!-- info: Whose language is in the dataset? -->
<!-- scope: periscope -->
The language is taken from reports by the Demographic and Health Surveys program.

#### License

<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International

#### Intended Use

<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
The dataset poses significant reasoning challenges and is thus meant as a way to assess the verbalization and reasoning capabilities of structure-to-text models.

#### Primary Task

<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Data-to-Text

#### Communicative Goal

<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
Summarize key information from a table in a single sentence.

### Credit

#### Curation Organization Type(s)

<!-- info: In what kind of organization did the dataset curation happen?
-->
<!-- scope: telescope -->
`industry`

#### Curation Organization(s)

<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
Google Research

#### Dataset Creators

<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Sebastian Gehrmann, Sebastian Ruder, Vitaly Nikolaev, Jan A. Botha, Michael Chavinda, Ankur Parikh, Clara Rivera

#### Funding

<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
Google Research

#### Who added the Dataset to GEM?

<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Sebastian Gehrmann (Google Research)

### Dataset Structure

#### Data Fields

<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
- `example_id`: The ID of the example. Each ID (e.g., `AB20-ar-1`) consists of three parts: the document ID, the language ISO 639-1 code, and the index of the table within the document.
- `title`: The title of the table.
- `unit_of_measure`: A description of the numerical value of the data. E.g., percentage of households with clean water.
- `chart_type`: The kind of chart associated with the data. We consider the following (normalized) types: horizontal bar chart, map chart, pie graph, tables, line chart, pie chart, vertical chart type, line graph, vertical bar chart, and other.
- `was_translated`: Whether the table was transcribed in the original language of the report or translated.
- `table_data`: The table content as a JSON-encoded string of a two-dimensional list, organized by row, from left to right, starting from the top of the table. The number of items varies per table. Empty cells are given as empty string values in the corresponding table cell.
- `table_text`: The sentences forming the description of each table, encoded as a JSON list; in the case of more than one sentence, each sentence is a separate item in the list. The number of items varies per table.
- `linearized_input`: A single string that contains the table content separated by vertical bars (|). It includes the title, the unit of measure, and the content of each cell together with its row and column headers in brackets, e.g., (Medium Empowerment, Mali, 17.9).

#### Reason for Structure

<!-- info: How was the dataset structure determined? -->
<!-- scope: microscope -->
The structure includes all available information for the infographics on which the dataset is based.

#### How were labels chosen?

<!-- info: How were the labels chosen? -->
<!-- scope: microscope -->
Annotators looked through English text to identify sentences that describe an infographic. They then identified the corresponding location in the parallel non-English document. All sentences were extracted.

#### Example Instance

<!-- info: Provide a JSON formatted example of a typical instance in the dataset.
--> <!-- scope: periscope --> ``` { "example_id": "FR346-en-39", "title": "Trends in early childhood mortality rates", "unit_of_measure": "Deaths per 1,000 live births for the 5-year period before the survey", "chart_type": "Line chart", "was_translated": "False", "table_data": "[[\"\", \"Child mortality\", \"Neonatal mortality\", \"Infant mortality\", \"Under-5 mortality\"], [\"1990 JPFHS\", 5, 21, 34, 39], [\"1997 JPFHS\", 6, 19, 29, 34], [\"2002 JPFHS\", 5, 16, 22, 27], [\"2007 JPFHS\", 2, 14, 19, 21], [\"2009 JPFHS\", 5, 15, 23, 28], [\"2012 JPFHS\", 4, 14, 17, 21], [\"2017-18 JPFHS\", 3, 11, 17, 19]]", "table_text": [ "neonatal, infant, child, and under-5 mortality rates for the 5 years preceding each of seven JPFHS surveys (1990 to 2017-18).", "Under-5 mortality declined by half over the period, from 39 to 19 deaths per 1,000 live births.", "The decline in mortality was much greater between the 1990 and 2007 surveys than in the most recent period.", "Between 2012 and 2017-18, under-5 mortality decreased only modestly, from 21 to 19 deaths per 1,000 live births, and infant mortality remained stable at 17 deaths per 1,000 births." 
], "linearized_input": "Trends in early childhood mortality rates | Deaths per 1,000 live births for the 5-year period before the survey | (Child mortality, 1990 JPFHS, 5) (Neonatal mortality, 1990 JPFHS, 21) (Infant mortality, 1990 JPFHS, 34) (Under-5 mortality, 1990 JPFHS, 39) (Child mortality, 1997 JPFHS, 6) (Neonatal mortality, 1997 JPFHS, 19) (Infant mortality, 1997 JPFHS, 29) (Under-5 mortality, 1997 JPFHS, 34) (Child mortality, 2002 JPFHS, 5) (Neonatal mortality, 2002 JPFHS, 16) (Infant mortality, 2002 JPFHS, 22) (Under-5 mortality, 2002 JPFHS, 27) (Child mortality, 2007 JPFHS, 2) (Neonatal mortality, 2007 JPFHS, 14) (Infant mortality, 2007 JPFHS, 19) (Under-5 mortality, 2007 JPFHS, 21) (Child mortality, 2009 JPFHS, 5) (Neonatal mortality, 2009 JPFHS, 15) (Infant mortality, 2009 JPFHS, 23) (Under-5 mortality, 2009 JPFHS, 28) (Child mortality, 2012 JPFHS, 4) (Neonatal mortality, 2012 JPFHS, 14) (Infant mortality, 2012 JPFHS, 17) (Under-5 mortality, 2012 JPFHS, 21) (Child mortality, 2017-18 JPFHS, 3) (Neonatal mortality, 2017-18 JPFHS, 11) (Infant mortality, 2017-18 JPFHS, 17) (Under-5 mortality, 2017-18 JPFHS, 19)"
}
```

#### Data Splits

<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
- `Train`: Training set, includes examples with 0 or more references.
- `Validation`: Validation set, includes examples with 3 or more references.
- `Test`: Test set, includes examples with 3 or more references.
- `Ru`: Russian zero-shot set. Includes English and Russian examples (Russian is not included in any of the other splits).

#### Splitting Criteria

<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here.
--> <!-- scope: microscope --> The same table across languages is always in the same split, i.e., if table X is in the test split in language A, it will also be in the test split in language B. In addition to filtering examples without transcribed table values, every example of the development and test splits has at least 3 references. From the examples that fulfilled these criteria, 100 tables were sampled for both development and test for a total of 800 examples each. A manual review process excluded a few tables in each set, resulting in a training set of 6,962 tables, a development set of 752 tables, and a test set of 763 tables. #### <!-- info: What does an outlier of the dataset in terms of length/perplexity/embedding look like? --> <!-- scope: microscope --> There are tables without references, without values, and others that are very large. The dataset is distributed as-is, but the paper describes multiple strategies to deal with data issues. ## Dataset in GEM ### Rationale for Inclusion in GEM #### Why is the Dataset in GEM? <!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? --> <!-- scope: microscope --> There is no other multilingual data-to-text dataset that is parallel over languages. Moreover, over 70% of references in the dataset require reasoning and it is thus of very high quality and challenging for models. #### Similar Datasets <!-- info: Do other datasets for the high level task exist? --> <!-- scope: telescope --> yes #### Unique Language Coverage <!-- info: Does this dataset cover other languages than other datasets for the same task? --> <!-- scope: periscope --> yes #### Difference from other GEM datasets <!-- info: What else sets this dataset apart from other similar datasets in GEM? 
-->
<!-- scope: microscope -->
More languages, parallel across languages, grounded in infographics, not centered on Western entities or source documents

#### Ability that the Dataset measures

<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
reasoning, verbalization, content planning

### GEM-Specific Curation

#### Modified for GEM?

<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
no

#### Additional Splits?

<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
no

### Getting Started with the Task

#### Pointers to Resources

<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
The background section of the [paper](https://arxiv.org/abs/2211.00142) provides a list of related datasets.

#### Technical Terms

<!-- info: Technical terms used in this card and the dataset and their definitions -->
<!-- scope: microscope -->
- `data-to-text`: Term that refers to NLP tasks in which the input is structured information and the output is natural language.

## Previous Results

### Previous Results

#### Metrics

<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`Other: Other Metrics`

#### Other Metrics

<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
`StATA`: A new metric associated with TaTA that is trained on human judgments and which has a much higher correlation with them.

#### Proposed Evaluation

<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task.
--> <!-- scope: microscope --> The creators used a human evaluation that measured [attribution](https://arxiv.org/abs/2112.12870) and reasoning capabilities of various models. Based on these ratings, they trained a new metric and showed that existing metrics fail to measure attribution. #### Previous results available? <!-- info: Are previous results available? --> <!-- scope: telescope --> no ## Dataset Curation ### Original Curation #### Original Curation Rationale <!-- info: Original curation rationale --> <!-- scope: telescope --> The curation rationale is to create a multilingual data-to-text dataset that is high-quality and challenging. #### Communicative Goal <!-- info: What was the communicative goal? --> <!-- scope: periscope --> The communicative goal is to describe a table in a single sentence. #### Sourced from Different Sources <!-- info: Is the dataset aggregated from different data sources? --> <!-- scope: telescope --> no ### Language Data #### How was Language Data Obtained? <!-- info: How was the language data obtained? --> <!-- scope: telescope --> `Found` #### Where was it found? <!-- info: If found, where from? --> <!-- scope: telescope --> `Single website` #### Language Producers <!-- info: What further information do we have on the language producers? --> <!-- scope: microscope --> The language was produced by USAID as part of the Demographic and Health Surveys program (https://dhsprogram.com/). #### Topics Covered <!-- info: Does the language in the dataset focus on specific topics? How would you describe them? --> <!-- scope: periscope --> The topics are related to fertility, family planning, maternal and child health, gender, and nutrition. #### Data Validation <!-- info: Was the text validated by a different worker or a data curator? --> <!-- scope: telescope --> validated by crowdworker #### Was Data Filtered? <!-- info: Were text instances selected or filtered? 
--> <!-- scope: telescope --> not filtered ### Structured Annotations #### Additional Annotations? <!-- quick --> <!-- info: Does the dataset have additional annotations for each instance? --> <!-- scope: telescope --> expert created #### Number of Raters <!-- info: What is the number of raters --> <!-- scope: telescope --> 11<n<50 #### Rater Qualifications <!-- info: Describe the qualifications required of an annotator. --> <!-- scope: periscope --> Professional annotator who is a fluent speaker of the respective language #### Raters per Training Example <!-- info: How many annotators saw each training example? --> <!-- scope: periscope --> 0 #### Raters per Test Example <!-- info: How many annotators saw each test example? --> <!-- scope: periscope --> 1 #### Annotation Service? <!-- info: Was an annotation service used? --> <!-- scope: telescope --> yes #### Which Annotation Service <!-- info: Which annotation services were used? --> <!-- scope: periscope --> `other` #### Annotation Values <!-- info: Purpose and values for each annotation --> <!-- scope: microscope --> The additional annotations are for system outputs and references and serve to develop metrics for this task. #### Any Quality Control? <!-- info: Quality control measures? --> <!-- scope: telescope --> validated by data curators #### Quality Control Details <!-- info: Describe the quality control measures that were taken. --> <!-- scope: microscope --> Ratings were compared to a small (English) expert-curated set of ratings to ensure high agreement. There were additional rounds of training and feedback to annotators to ensure high quality judgments. ### Consent #### Any Consent Policy? <!-- info: Was there a consent policy involved when gathering the data? --> <!-- scope: telescope --> yes #### Other Consented Downstream Use <!-- info: What other downstream uses of the data did the original data creators and the data curators consent to? 
-->
<!-- scope: microscope -->
In addition to data-to-text generation, the dataset can be used for translation or multimodal research.

### Private Identifying Information (PII)

#### Contains PII?

<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII

#### Justification for no PII

<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
The DHS program only publishes aggregate survey information and thus no personal information is included.

### Maintenance

#### Any Maintenance Plan?

<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no

## Broader Social Context

### Previous Work on the Social Impact of the Dataset

#### Usage of Models based on the Data

<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no

### Impact on Under-Served Communities

#### Addresses needs of underserved Communities?

<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
yes

#### Details on how Dataset Addresses the Needs

<!-- info: Describe how this dataset addresses the needs of underserved communities. -->
<!-- scope: microscope -->
The dataset focuses on data about African countries, and the languages included in the dataset are spoken in Africa. It aims to improve the representation of African languages in the NLP and NLG communities.

### Discussion of Biases

#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. --> <!-- scope: telescope --> no #### Are the Language Producers Representative of the Language? <!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? --> <!-- scope: periscope --> The language producers for this dataset are those employed by the DHS program, which is a US-funded program. While the data is focused on African countries, there may be implicit western biases in how the data is presented. ## Considerations for Using the Data ### PII Risks and Liability ### Licenses #### Copyright Restrictions on the Dataset <!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? --> <!-- scope: periscope --> `open license - commercial use allowed` #### Copyright Restrictions on the Language Data <!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? --> <!-- scope: periscope --> `open license - commercial use allowed` ### Known Technical Limitations #### Technical Limitations <!-- info: Describe any known technical limitations, such as spurious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. --> <!-- scope: microscope --> While tables were transcribed in the available languages, the majority of the tables were published with English as the first language. 
Professional translators were used to translate the data, which makes it plausible that some translationese exists in the data. Moreover, it was unavoidable to collect reference sentences that are only partially entailed by the source tables. #### Unsuited Applications <!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. --> <!-- scope: microscope --> The domain of health reports includes potentially sensitive topics relating to reproduction, violence, sickness, and death. Perceived negative values could be used to amplify stereotypes about people from the respective regions or countries. The intended academic use of this dataset is to develop and evaluate models that neutrally report the content of these tables, not to use the outputs to make value judgments; such applications are thus discouraged.
hfvladkon/WiNER
--- dataset_info: features: - name: id dtype: string - name: text dtype: string - name: tokens sequence: string - name: pos_tags sequence: string - name: ner_tags sequence: string splits: - name: train num_bytes: 133047685 num_examples: 203286 download_size: 46621835 dataset_size: 133047685 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "WiNER" ## WiNER: A Wikipedia Annotated Corpus for Named Entity Recognition ## Sample ```json {"id": "1", "text": "Π’ Π΄ΠΎΠ³ΠΎΠ²ΠΎΡ€Π΅ срСди 5 ΡΡ‚Π°Ρ€ΡˆΠΈΡ… князСй упоминаСтся Миндовг .", "tokens": ["Π’", "Π΄ΠΎΠ³ΠΎΠ²ΠΎΡ€Π΅", "срСди", "5", "ΡΡ‚Π°Ρ€ΡˆΠΈΡ…", "князСй", "упоминаСтся", "Миндовг", "."], "pos_tags": ["PR", "S", "PR", "NUM", "A", "S", "V", "S", "SENT"], "ner_tags": ["O", "O", "O", "O", "O", "O", "O", "I-PER", "O"]} ``` ## Citation [WiNER: A Wikipedia Annotated Corpus for Named Entity Recognition](https://aclanthology.org/I17-1042/) [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
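The IOB-style `ner_tags` in the sample above can be decoded into entity spans with a short helper. Below is a minimal sketch; the `extract_entities` function is illustrative (not part of the dataset or the WiNER paper) and is run here against the sample record shown on this card:

```python
# Illustrative helper: turn parallel (tokens, ner_tags) lists into
# (entity_text, entity_type) spans. Handles both "B-"/"I-" prefixes
# and plain IOB1-style tags as in the sample above.
def extract_entities(tokens, ner_tags):
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, ner_tags):
        if tag == "O":
            # Close any open entity span at an outside tag.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
                current_tokens, current_type = [], None
            continue
        prefix, ent_type = tag.split("-", 1)
        if prefix == "B" or ent_type != current_type:
            # Start a new entity span (closing the previous one, if any).
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], ent_type
        else:
            current_tokens.append(token)
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

# The sample record from this card:
sample = {
    "tokens": ["Π’", "Π΄ΠΎΠ³ΠΎΠ²ΠΎΡ€Π΅", "срСди", "5", "ΡΡ‚Π°Ρ€ΡˆΠΈΡ…", "князСй",
               "упоминаСтся", "Миндовг", "."],
    "ner_tags": ["O", "O", "O", "O", "O", "O", "O", "I-PER", "O"],
}
print(extract_entities(sample["tokens"], sample["ner_tags"]))  # [('Миндовг', 'PER')]
```

The same logic applies to records loaded from this dataset's `train` split via `load_dataset("hfvladkon/WiNER", split="train")`.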
open-llm-leaderboard/details_NovoCode__Novocode7b
--- pretty_name: Evaluation run of NovoCode/Novocode7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [NovoCode/Novocode7b](https://huggingface.co/NovoCode/Novocode7b) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Novocode7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-23T01:09:59.087164](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Novocode7b/blob/main/results_2024-01-23T01-09-59.087164.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5637380070206868,\n\ \ \"acc_stderr\": 0.03397699301826096,\n \"acc_norm\": 0.5694898071045811,\n\ \ \"acc_norm_stderr\": 0.03471749621521052,\n \"mc1\": 0.4663402692778458,\n\ \ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6276801807189292,\n\ \ \"mc2_stderr\": 0.015415755094430335\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.01454451988063383,\n\ \ \"acc_norm\": 0.5878839590443686,\n \"acc_norm_stderr\": 0.014383915302225403\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6214897430790679,\n\ \ \"acc_stderr\": 0.004840244782805302,\n \"acc_norm\": 0.8051185022903804,\n\ \ \"acc_norm_stderr\": 0.003952999181084448\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\ \ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\ \ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n\ \ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\ \ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\ \ \"acc_norm_stderr\": 0.04101405519842426\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\ \ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\ \ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\ \ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\ \ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\ \ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\ \ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\ \ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\ \ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\ \ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\ acc_norm\": 0.32275132275132273,\n 
\"acc_norm_stderr\": 0.024078943243597016\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\ \ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\ \ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6451612903225806,\n \"acc_stderr\": 0.027218889773308753,\n \"\ acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.027218889773308753\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n \"\ acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\ \ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n\ \ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \ \ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \ \ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\ acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\ acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399814,\n \"\ acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399814\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990403,\n \ \ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990403\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\ \ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\ \ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\ \ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\ acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\ \ \"acc_stderr\": 0.0458790474130181,\n \"acc_norm\": 0.6574074074074074,\n\ \ \"acc_norm_stderr\": 0.0458790474130181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\ \ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\ \ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\ \ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\ \ \"acc_stderr\": 0.015464676163395965,\n \"acc_norm\": 0.7509578544061303,\n\ \ \"acc_norm_stderr\": 0.015464676163395965\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n\ \ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\ \ \"acc_stderr\": 0.016232826818678513,\n \"acc_norm\": 0.37988826815642457,\n\ \ 
\"acc_norm_stderr\": 0.016232826818678513\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n\ \ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\ \ \"acc_stderr\": 0.02736807824397165,\n \"acc_norm\": 0.6334405144694534,\n\ \ \"acc_norm_stderr\": 0.02736807824397165\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662737,\n\ \ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662737\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \ \ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3859191655801825,\n\ \ \"acc_stderr\": 0.012433398911476143,\n \"acc_norm\": 0.3859191655801825,\n\ \ \"acc_norm_stderr\": 0.012433398911476143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\ \ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5245098039215687,\n \"acc_stderr\": 0.02020351728026144,\n \ \ \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.02020351728026144\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\ \ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\ \ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\ \ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\ \ },\n \"harness|hendrycksTest-sociology|5\": 
{\n \"acc\": 0.8308457711442786,\n\ \ \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n\ \ \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\ \ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\ \ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\ \ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\ \ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6276801807189292,\n\ \ \"mc2_stderr\": 0.015415755094430335\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773218\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \ \ \"acc_stderr\": 0.011600249020595822\n }\n}\n```" repo_url: https://huggingface.co/NovoCode/Novocode7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|arc:challenge|25_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|arc:challenge|25_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|arc:challenge|25_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-23T01-09-59.087164.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - 
'**/details_harness|gsm8k|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|gsm8k|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|gsm8k|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hellaswag|10_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hellaswag|10_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hellaswag|10_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-20-28.943538.parquet' - 
'**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-20-28.943538.parquet' - 
'**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-20-28.943538.parquet' - 
'**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-14T21-20-28.943538.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T00-46-49.917108.parquet' - 
'**/details_harness|hendrycksTest-global_facts|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T00-46-49.917108.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T00-46-49.917108.parquet' - 
    '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-01-23T00-46-49.917108.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-management|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-management|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2024_01_14T21_20_28.943538
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T21-20-28.943538.parquet'
  - split: 2024_01_23T00_46_49.917108
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T00-46-49.917108.parquet'
  - split: 2024_01_23T01_09_59.087164
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T01-09-59.087164.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-virology|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T01-09-59.087164.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|truthfulqa:mc|0_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|truthfulqa:mc|0_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|truthfulqa:mc|0_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-23T01-09-59.087164.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_14T21_20_28.943538 path: - '**/details_harness|winogrande|5_2024-01-14T21-20-28.943538.parquet' - split: 2024_01_23T00_46_49.917108 path: - '**/details_harness|winogrande|5_2024-01-23T00-46-49.917108.parquet' - split: 2024_01_23T01_09_59.087164 path: - '**/details_harness|winogrande|5_2024-01-23T01-09-59.087164.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-23T01-09-59.087164.parquet' - config_name: results data_files: - split: 2024_01_14T21_20_28.943538 path: - results_2024-01-14T21-20-28.943538.parquet - split: 2024_01_23T00_46_49.917108 path: - results_2024-01-23T00-46-49.917108.parquet - split: 2024_01_23T01_09_59.087164 path: - results_2024-01-23T01-09-59.087164.parquet - split: latest path: - results_2024-01-23T01-09-59.087164.parquet --- # 
Dataset Card for Evaluation run of NovoCode/Novocode7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [NovoCode/Novocode7b](https://huggingface.co/NovoCode/Novocode7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can, for instance, do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NovoCode__Novocode7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-23T01:09:59.087164](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Novocode7b/blob/main/results_2024-01-23T01-09-59.087164.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5637380070206868, "acc_stderr": 0.03397699301826096, "acc_norm": 0.5694898071045811, "acc_norm_stderr": 0.03471749621521052, "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168106, "mc2": 0.6276801807189292, "mc2_stderr": 0.015415755094430335 }, "harness|arc:challenge|25": { "acc": 0.5477815699658704, "acc_stderr": 0.01454451988063383, "acc_norm": 0.5878839590443686, "acc_norm_stderr": 0.014383915302225403 }, "harness|hellaswag|10": { "acc": 0.6214897430790679, "acc_stderr": 0.004840244782805302, "acc_norm": 0.8051185022903804, "acc_norm_stderr": 0.003952999181084448 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5526315789473685, "acc_stderr": 0.04046336883978251, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.04046336883978251 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.02964781353936525, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.02964781353936525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956913, "acc_norm": 0.49, "acc_norm_stderr": 
0.05024183937956913 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5063829787234042, "acc_stderr": 0.032683358999363366, "acc_norm": 0.5063829787234042, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.04630653203366595, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.04630653203366595 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.32275132275132273, "acc_stderr": 0.024078943243597016, "acc_norm": 0.32275132275132273, "acc_norm_stderr": 0.024078943243597016 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6451612903225806, "acc_stderr": 0.027218889773308753, "acc_norm": 0.6451612903225806, "acc_norm_stderr": 0.027218889773308753 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.42857142857142855, "acc_stderr": 0.03481904844438803, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.03481904844438803 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6303030303030303, "acc_stderr": 0.03769430314512567, "acc_norm": 0.6303030303030303, "acc_norm_stderr": 0.03769430314512567 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03173071239071724, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7875647668393783, "acc_stderr": 0.02951928261681723, "acc_norm": 0.7875647668393783, "acc_norm_stderr": 0.02951928261681723 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.541025641025641, "acc_stderr": 0.025265525491284295, "acc_norm": 0.541025641025641, "acc_norm_stderr": 0.025265525491284295 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524575, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524575 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5462184873949579, "acc_stderr": 0.03233943468182088, "acc_norm": 0.5462184873949579, "acc_norm_stderr": 0.03233943468182088 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7504587155963303, "acc_stderr": 0.018553897629501628, "acc_norm": 0.7504587155963303, "acc_norm_stderr": 0.018553897629501628 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 
0.03362277436608044, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03362277436608044 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7156862745098039, "acc_stderr": 0.03166009679399814, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.03166009679399814 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.679324894514768, "acc_stderr": 0.030381931949990403, "acc_norm": 0.679324894514768, "acc_norm_stderr": 0.030381931949990403 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6502242152466368, "acc_stderr": 0.03200736719484503, "acc_norm": 0.6502242152466368, "acc_norm_stderr": 0.03200736719484503 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7024793388429752, "acc_stderr": 0.04173349148083499, "acc_norm": 0.7024793388429752, "acc_norm_stderr": 0.04173349148083499 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6574074074074074, "acc_stderr": 0.0458790474130181, "acc_norm": 0.6574074074074074, "acc_norm_stderr": 0.0458790474130181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.03559039531617342, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.03559039531617342 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729224, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8376068376068376, "acc_stderr": 0.02416161812798774, "acc_norm": 0.8376068376068376, "acc_norm_stderr": 0.02416161812798774 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, 
"acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7509578544061303, "acc_stderr": 0.015464676163395965, "acc_norm": 0.7509578544061303, "acc_norm_stderr": 0.015464676163395965 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6098265895953757, "acc_stderr": 0.026261677607806642, "acc_norm": 0.6098265895953757, "acc_norm_stderr": 0.026261677607806642 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37988826815642457, "acc_stderr": 0.016232826818678513, "acc_norm": 0.37988826815642457, "acc_norm_stderr": 0.016232826818678513 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6045751633986928, "acc_stderr": 0.027996723180631462, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.027996723180631462 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6334405144694534, "acc_stderr": 0.02736807824397165, "acc_norm": 0.6334405144694534, "acc_norm_stderr": 0.02736807824397165 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5925925925925926, "acc_stderr": 0.027339546640662737, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.027339546640662737 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4078014184397163, "acc_stderr": 0.02931601177634356, "acc_norm": 0.4078014184397163, "acc_norm_stderr": 0.02931601177634356 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3859191655801825, "acc_stderr": 0.012433398911476143, "acc_norm": 0.3859191655801825, "acc_norm_stderr": 0.012433398911476143 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5367647058823529, "acc_stderr": 0.03029061918048569, "acc_norm": 0.5367647058823529, "acc_norm_stderr": 0.03029061918048569 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5245098039215687, "acc_stderr": 0.02020351728026144, "acc_norm": 0.5245098039215687, "acc_norm_stderr": 0.02020351728026144 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.046534298079135075, "acc_norm": 
0.6181818181818182, "acc_norm_stderr": 0.046534298079135075 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5877551020408164, "acc_stderr": 0.03151236044674268, "acc_norm": 0.5877551020408164, "acc_norm_stderr": 0.03151236044674268 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623326, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623326 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.030944459778533193, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.030944459778533193 }, "harness|truthfulqa:mc|0": { "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168106, "mc2": 0.6276801807189292, "mc2_stderr": 0.015415755094430335 }, "harness|winogrande|5": { "acc": 0.7813733228097869, "acc_stderr": 0.011616198215773218 }, "harness|gsm8k|5": { "acc": 0.2304776345716452, "acc_stderr": 0.011600249020595822 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
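As a worked example of how the aggregate `acc` reported under `"all"` in the Latest results relates to the per-task entries, the sketch below computes an unweighted (macro) mean over a small subset of the task scores copied from the JSON above. Note this is an illustrative assumption about the aggregation, and the subset mean here will not match the full aggregate, which averages all tasks:

```python
def macro_avg_acc(results):
    """Unweighted mean of per-task 'acc' values, mirroring the 'all' entry."""
    accs = [metrics["acc"] for metrics in results.values()]
    return sum(accs) / len(accs)

# A few per-task entries copied from the results JSON above (subset only).
subset = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5526315789473685},
}
print(round(macro_avg_acc(subset), 4))  # → 0.4496
```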
PipableAI/spider-hardness
--- dataset_info: features: - name: schema dtype: string - name: question dtype: string - name: query dtype: string - name: hardness dtype: string splits: - name: train num_bytes: 6507829 num_examples: 7000 download_size: 404015 dataset_size: 6507829 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "spider-hardness" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
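Given the schema above (string columns `schema`, `question`, `query`, `hardness`), a minimal sketch of bucketing examples by hardness level. The sample rows and the hardness labels (`easy`, `extra`) are hypothetical; in practice `rows` would come from `load_dataset("PipableAI/spider-hardness", split="train")`:

```python
from collections import Counter

def hardness_distribution(rows):
    """Count examples per hardness bucket; rows follow the card's schema."""
    return Counter(row["hardness"] for row in rows)

# Hypothetical rows mirroring the card's columns (values are placeholders).
sample = [
    {"schema": "...", "question": "How many singers are there?",
     "query": "SELECT count(*) FROM singer", "hardness": "easy"},
    {"schema": "...", "question": "...", "query": "...", "hardness": "extra"},
    {"schema": "...", "question": "...", "query": "...", "hardness": "easy"},
]
print(hardness_distribution(sample))  # → Counter({'easy': 2, 'extra': 1})
```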
CyberHarem/yuubari_kantaicollection
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of yuubari/倕弡 (Kantai Collection) This is the dataset of yuubari/倕弡 (Kantai Collection), containing 500 images and their tags. The core tags of this character are `ponytail, brown_eyes, green_hair, bow, bangs, hair_bow, grey_hair, long_hair, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 516.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuubari_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 321.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuubari_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1232 | 702.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuubari_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 465.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuubari_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1232 | 940.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuubari_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/yuubari_kantaicollection', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | playboy_bunny, 1girl, looking_at_viewer, rabbit_ears, solo, wrist_cuffs, detached_collar, fake_animal_ears, simple_background, strapless_leotard, white_background, bowtie, rabbit_tail, black_leotard, black_pantyhose, cowboy_shot, dated, green_bow, small_breasts, alternate_costume, cleavage, hair_ribbon, medium_breasts | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bowtie, green_skirt, pleated_skirt, serafuku, short_sleeves, solo, looking_at_viewer, black_pantyhose, midriff, navel, smile, crop_top, open_mouth, machinery, blush | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, serafuku, short_sleeves, solo, upper_body, looking_at_viewer, orange_bowtie, sailor_collar, simple_background, green_bow, shirt, smile, white_background, open_mouth | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, 
black_shirt, grey_sailor_collar, hair_ribbon, looking_at_viewer, one-hour_drawing_challenge, orange_neckerchief, serafuku, simple_background, solo, white_background, dated, green_skirt, midriff, navel, upper_body, white_ribbon, twitter_username, crop_top, pleated_skirt, short_sleeves | | 4 | 24 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, pleated_skirt, serafuku, solo, grey_sailor_collar, hair_ribbon, orange_neckerchief, green_skirt, white_ribbon, looking_at_viewer, black_pantyhose, midriff, navel, black_shirt, cowboy_shot, white_background, grey_skirt, orange_necktie, simple_background, crop_top, smile, black_belt, short_sleeves | | 5 | 35 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, solo, black_bikini, looking_at_viewer, hair_ribbon, white_ribbon, cowboy_shot, simple_background, navel, small_breasts, smile, white_background, side-tie_bikini_bottom, one-hour_drawing_challenge | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, solo, jumpsuit, looking_at_viewer, navel, smile, white_tank_top, cowboy_shot, midriff, collarbone, one-hour_drawing_challenge, blush, medium_breasts, one_eye_closed, pants, twitter_username | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_headwear, black_sweater, hat, long_sleeves, solo, white_shirt, green_skirt, looking_at_viewer, official_alternate_costume, twitter_username, sitting, collared_shirt, one-hour_drawing_challenge, open_mouth, smile | | 8 | 23 | ![](samples/8/clu8-sample0.png) | 
![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, nipples, navel, 1boy, hetero, open_mouth, sex, solo_focus, small_breasts, penis, vaginal, bar_censor, cum_in_pussy, female_pubic_hair, looking_at_viewer, shirt_lift | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, alternate_costume, kimono, looking_at_viewer, obi, solo, wide_sleeves, open_mouth, ribbon, smile, happy_new_year | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | playboy_bunny | 1girl | looking_at_viewer | rabbit_ears | solo | wrist_cuffs | detached_collar | fake_animal_ears | simple_background | strapless_leotard | white_background | bowtie | rabbit_tail | black_leotard | black_pantyhose | cowboy_shot | dated | green_bow | small_breasts | alternate_costume | cleavage | hair_ribbon | medium_breasts | green_skirt | pleated_skirt | serafuku | short_sleeves | midriff | navel | smile | crop_top | open_mouth | machinery | blush | upper_body | orange_bowtie | sailor_collar | shirt | black_shirt | grey_sailor_collar | one-hour_drawing_challenge | orange_neckerchief | white_ribbon | twitter_username | grey_skirt | orange_necktie | black_belt | black_bikini | side-tie_bikini_bottom | jumpsuit | white_tank_top | collarbone | one_eye_closed | pants | black_headwear | black_sweater | hat | long_sleeves | white_shirt | official_alternate_costume | sitting | collared_shirt | nipples | 1boy | hetero | sex | solo_focus | penis | vaginal | bar_censor | cum_in_pussy | female_pubic_hair | shirt_lift | kimono | obi | wide_sleeves | ribbon | happy_new_year | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------|:--------|:--------------------|:--------------|:-------|:--------------|:------------------|:-------------------|:--------------------|:--------------------|:-------------------|:---------|:--------------|:----------------|:------------------|:--------------|:--------|:------------|:----------------|:--------------------|:-----------|:--------------|:-----------------|:--------------|:----------------|:-----------|:----------------|:----------|:--------|:--------|:-----------|:-------------|:------------|:--------|:-------------|:----------------|:----------------|:--------|:--------------|:---------------------|:-----------------------------|:---------------------|:---------------|:-------------------|:-------------|:-----------------|:-------------|:---------------|:-------------------------|:-----------|:-----------------|:-------------|:-----------------|:--------|:-----------------|:----------------|:------|:---------------|:--------------|:-----------------------------|:----------|:-----------------|:----------|:-------|:---------|:------|:-------------|:--------|:----------|:-------------|:---------------|:--------------------|:-------------|:---------|:------|:---------------|:---------|:-----------------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | 
![](samples/1/clu1-sample4.png) | | X | X | | X | | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | X | X | | X | | | | X | | X | | | | | | | X | | | | | | | | X | X | | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | | X | X | | X | | | | X | | X | | | | | | X | | | | | X | | X | X | X | X | X | X | | X | | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 24 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | | X | X | | X | | | | X | | X | | | | X | X | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | X | X | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 35 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | | X | X | | X | | | | X | | X | | | | | X | | | X | | | X | | | | | | | X | X | | | | | | | | | | | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | | X | X | | X | | | | | | | | | | | X | | | | | | | X | | | | | X | X | X | | | | X | | | | | | | X | | | X 
| | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | | X | | X | | | | | | | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | 8 | 23 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | | X | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
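Each IMG+TXT package above extracts to images accompanied by same-stem `.txt` tag files. For pipelines that do not use waifuc, pairing them up is a simple directory walk; the sketch below assumes comma-separated tags in each `.txt` file (a common IMG+TXT training layout, not verified against the archives):

```python
import os


def load_img_txt_pairs(dataset_dir):
    """Pair each image with its same-stem .txt tag file, if present.

    Returns a list of (image_path, [tag, ...]) tuples. The comma-separated
    tag format is an assumption based on typical IMG+TXT layouts.
    """
    pairs = []
    for root, _, files in os.walk(dataset_dir):
        for name in sorted(files):
            stem, ext = os.path.splitext(name)
            if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
                continue
            txt_path = os.path.join(root, stem + '.txt')
            if not os.path.exists(txt_path):
                continue  # image without a tag file is skipped
            with open(txt_path, encoding='utf-8') as f:
                tags = [t.strip() for t in f.read().split(',') if t.strip()]
            pairs.append((os.path.join(root, name), tags))
    return pairs
```

Point it at the directory produced by extracting one of the IMG+TXT zips (e.g. `dataset-800.zip`).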
zhangshuoming/c_x86_avx2_extension_filtered
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 856916.0 num_examples: 1101 download_size: 129124 dataset_size: 856916.0 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "c_x86_avx2_extension_filtered" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hopee4/summer
--- license: openrail ---
Weyaxi/huggingface-leaderboard
---
viewer: false
---

# Huggingface Leaderboard's History Dataset 🏆

This is the history dataset of [Huggingface Leaderboard](https://huggingface.co/spaces/PulsarAI/huggingface-leaderboard).

🗒️ This dataset contains full dataframes in a CSV file for each time lapse.

⌛ This dataset is automatically updated when the space restarts (approximately every 6 hours).

## Leaderboard Link

🔗 [PulsarAI/huggingface-leaderboard](https://huggingface.co/spaces/PulsarAI/huggingface-leaderboard)
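Since each time lapse is stored as its own CSV file in this repo, finding the most recent snapshot reduces to filtering and sorting the repo's file listing. A minimal sketch over a listing such as the one returned by `huggingface_hub.list_repo_files` (the example file names below are assumptions; real names come from the repo):

```python
def latest_csv_snapshot(repo_files):
    """Return the lexicographically last .csv path, or None if there is none.

    This works when snapshot names embed a sortable timestamp; otherwise,
    sort by commit date instead.
    """
    csvs = sorted(f for f in repo_files if f.endswith('.csv'))
    return csvs[-1] if csvs else None
```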
luizotavio22/diozinho
--- license: apache-2.0 ---
distilled-one-sec-cv12-each-chunk-uniq/chunk_47
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1189674580.0 num_examples: 231815 download_size: 1215359543 dataset_size: 1189674580.0 --- # Dataset Card for "chunk_47" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Vanimal0221/VaanceFace
--- license: artistic-2.0 ---
rishabhjain16/alb_t2
--- dataset_info: features: - name: audio dtype: audio - name: transcription dtype: string splits: - name: test num_bytes: 403933.0 num_examples: 6 download_size: 404365 dataset_size: 403933.0 configs: - config_name: default data_files: - split: test path: data/test-* ---
CyberHarem/tone_kantaicollection
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of tone/利根/利根 (Kantai Collection)

This is the dataset of tone/利根/利根 (Kantai Collection), containing 500 images and their tags.

The core tags of this character are `long_hair, twintails, ribbon, brown_hair, hair_ribbon, white_ribbon, brown_eyes, hair_between_eyes, breasts, bow`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 492.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tone_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 314.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tone_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1143 | 658.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tone_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 447.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tone_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
|
| stage3-p480-1200 | 1143 | 882.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tone_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/tone_kantaicollection',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined from them.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, pelvic_curtain, single_elbow_glove, single_thighhigh, smile, solo, uneven_legwear, black_gloves, side_slit, looking_at_viewer, single_glove, boots, no_panties, open_mouth, hand_on_hip | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, military_uniform, pelvic_curtain, simple_background, solo, white_background, dated, looking_at_viewer, one-hour_drawing_challenge, single_thighhigh, sitting, twitter_username, uneven_legwear, black_thighhighs, single_elbow_glove, black_footwear, black_gloves, boots, red_bowtie | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, looking_at_viewer, fang, :d, open_mouth, upper_body | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, simple_background, solo, closed_mouth, collarbone, small_breasts, blush, looking_at_viewer, micro_bikini, white_background, green_eyes, navel, smile | ### Table 
Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | pelvic_curtain | single_elbow_glove | single_thighhigh | smile | solo | uneven_legwear | black_gloves | side_slit | looking_at_viewer | single_glove | boots | no_panties | open_mouth | hand_on_hip | military_uniform | simple_background | white_background | dated | one-hour_drawing_challenge | sitting | twitter_username | black_thighhighs | black_footwear | red_bowtie | fang | :d | upper_body | closed_mouth | collarbone | small_breasts | blush | micro_bikini | green_eyes | navel | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------------|:-------------------|:--------|:-------|:-----------------|:---------------|:------------|:--------------------|:---------------|:--------|:-------------|:-------------|:--------------|:-------------------|:--------------------|:-------------------|:--------|:-----------------------------|:----------|:-------------------|:-------------------|:-----------------|:-------------|:-------|:-----|:-------------|:---------------|:-------------|:----------------|:--------|:---------------|:-------------|:--------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | X | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | 
![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | | X | | | | X | | | | X | | | | | | | | | | | | X | X | X | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | X | X | | | | X | | | | | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X |
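Since every item loaded by the waifuc snippet above exposes its tags through `item.meta['tags']`, mining an outfit cluster such as the `military_uniform` rows in the tables can start from a plain tag filter. A minimal sketch over metadata dicts, where the dict shape is assumed to mirror `item.meta` (waifuc may store tags as a list or a tag-to-score mapping; iterating either yields tag names):

```python
def filter_by_tags(metas, required):
    """Keep metadata entries whose tag set contains every required tag.

    ``metas`` is an iterable of dicts shaped like ``item.meta`` in the
    waifuc snippet: each has a ``'tags'`` entry holding tag names.
    """
    required = set(required)
    return [m for m in metas if required.issubset(set(m['tags']))]


# toy metadata standing in for item.meta dicts (file names are illustrative)
metas = [
    {'filename': '1.png', 'tags': ['1girl', 'solo', 'military_uniform']},
    {'filename': '2.png', 'tags': ['1girl', 'solo', 'micro_bikini']},
]
hits = filter_by_tags(metas, ['military_uniform'])
```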
DeliberatorArchiver/movie_binaries_0014
--- license: cc-by-nc-nd-4.0 viewer: false ---
haixuantao/dora-rs
--- license: mit ---
shiroup/surtr_arknights
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of surtr/スルト/史尔特尔 (Arknights)

This is the dataset of surtr/スルト/史尔特尔 (Arknights), containing 500 images and their tags.

The core tags of this character are `horns, red_hair, long_hair, purple_eyes, bangs, breasts, very_long_hair, hair_between_eyes, medium_breasts, large_breasts, demon_horns`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 524.06 MiB | [Download](https://huggingface.co/datasets/shiroup/surtr_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 425.99 MiB | [Download](https://huggingface.co/datasets/shiroup/surtr_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1179 | 794.27 MiB | [Download](https://huggingface.co/datasets/shiroup/surtr_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 496.20 MiB | [Download](https://huggingface.co/datasets/shiroup/surtr_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
|
| stage3-p480-1200 | 1179 | 886.66 MiB | [Download](https://huggingface.co/datasets/shiroup/surtr_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='shiroup/surtr_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined from them.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, solo, black_dress, cleavage, detached_collar, looking_at_viewer, upper_body, chest_strap, off_shoulder, sleeveless_dress, simple_background, hair_intakes, closed_mouth, white_background, black_jacket, grey_background |
| 1 | 28 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_dress, solo, bare_shoulders, looking_at_viewer, black_thighhighs, off_shoulder, cleavage, black_jacket, demon_girl, chest_strap, detached_collar, holding_sword, black_footwear, full_body, open_jacket, high_heels, molten_rock, standing, long_sleeves |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, black_dress, black_jacket, black_thighhighs, cleavage, cowboy_shot, looking_at_viewer, off_shoulder, solo, chest_strap, detached_collar, smile, open_clothes, simple_background, white_background, closed_mouth, long_sleeves |
| 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, black_bikini, cleavage, detached_sleeves, hair_rings, long_sleeves, looking_at_viewer, navel, official_alternate_costume, side-tie_bikini_bottom, solo, star_hair_ornament, stomach, criss-cross_halter, swimsuit_cover-up, thigh_strap, closed_mouth, black_choker, holding_food, sitting, thighs |
| 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, black_bikini, cleavage, criss-cross_halter, day, detached_sleeves, long_sleeves, looking_at_viewer, navel, official_alternate_costume, outdoors, solo, blue_sky, star_hair_ornament, stomach, thigh_strap, closed_mouth, side-tie_bikini_bottom, cowboy_shot, hair_rings, swimsuit_cover-up, thighs, black_choker, arm_up, armpits, standing |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, black_bikini, cherry, cleavage, criss-cross_halter, day, detached_sleeves, hair_intakes, long_sleeves, looking_at_viewer, mouth_hold, navel, official_alternate_costume, side-tie_bikini_bottom, solo, stomach, armpits, black_ribbon, blue_sky, cowboy_shot, hair_rings, outdoors, swimsuit_cover-up, thigh_strap, arms_up, black_choker, closed_mouth, ocean, petals, cloud, holding, smile, star_hair_ornament, water |
| 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_belt, black_choker, black_shirt, cowboy_shot, crop_top, cross_necklace, looking_at_viewer, midriff, miniskirt, official_alternate_costume, pencil_skirt, red_skirt, short_sleeves, solo, thigh_strap, closed_mouth, navel, arm_up, black_gloves, single_glove, standing, buckle, stomach, sword, thighs |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | black_dress | cleavage | detached_collar | looking_at_viewer | upper_body | chest_strap | off_shoulder | sleeveless_dress | simple_background | hair_intakes | closed_mouth | white_background | black_jacket | grey_background | black_thighhighs | demon_girl | holding_sword | black_footwear | full_body | open_jacket | high_heels | molten_rock | standing | long_sleeves | cowboy_shot | smile | open_clothes | black_bikini | detached_sleeves | hair_rings | navel | official_alternate_costume | side-tie_bikini_bottom | star_hair_ornament | stomach | criss-cross_halter | swimsuit_cover-up | thigh_strap | black_choker | holding_food | sitting | thighs | day | outdoors | blue_sky | arm_up | armpits | cherry | mouth_hold | black_ribbon | arms_up | ocean | petals | cloud | holding | water | black_belt | black_shirt | crop_top | cross_necklace | midriff | miniskirt | pencil_skirt | red_skirt | short_sleeves | black_gloves | single_glove | buckle | sword |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------|:-----------|:------------------|:--------------------|:-------------|:--------------|:---------------|:-------------------|:--------------------|:---------------|:---------------|:-------------------|:---------------|:------------------|:-------------------|:-------------|:----------------|:-----------------|:------------|:--------------|:-------------|:--------------|:-----------|:---------------|:--------------|:--------|:---------------|:---------------|:-------------------|:-------------|:--------|:-----------------------------|:-------------------------|:---------------------|:----------|:---------------------|:--------------------|:--------------|:---------------|:---------------|:----------|:---------|:------|:-----------|:-----------|:---------|:----------|:---------|:-------------|:---------------|:----------|:--------|:---------|:--------|:----------|:--------|:-------------|:--------------|:-----------|:-----------------|:----------|:------------|:---------------|:------------|:----------------|:---------------|:---------------|:---------|:--------| | 0 | 21 | ![](samples\0\clu0-sample0.png) | ![](samples\0\clu0-sample1.png) | ![](samples\0\clu0-sample2.png) | ![](samples\0\clu0-sample3.png) | ![](samples\0\clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 28 | ![](samples\1\clu1-sample0.png) | ![](samples\1\clu1-sample1.png) | ![](samples\1\clu1-sample2.png) | ![](samples\1\clu1-sample3.png) | ![](samples\1\clu1-sample4.png) | X | X | X | X | X | X | X | | X | X | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples\2\clu2-sample0.png) | ![](samples\2\clu2-sample1.png) | ![](samples\2\clu2-sample2.png) | ![](samples\2\clu2-sample3.png) | ![](samples\2\clu2-sample4.png) | X | X | X | X | X | X | X | | X | X | | X | | X | X | X | | X | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 10 | ![](samples\3\clu3-sample0.png) | ![](samples\3\clu3-sample1.png) | ![](samples\3\clu3-sample2.png) | ![](samples\3\clu3-sample3.png) | ![](samples\3\clu3-sample4.png) | X | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 11 | ![](samples\4\clu4-sample0.png) | ![](samples\4\clu4-sample1.png) | ![](samples\4\clu4-sample2.png) | ![](samples\4\clu4-sample3.png) | ![](samples\4\clu4-sample4.png) | X | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples\5\clu5-sample0.png) | ![](samples\5\clu5-sample1.png) | ![](samples\5\clu5-sample2.png) | ![](samples\5\clu5-sample3.png) | ![](samples\5\clu5-sample4.png) | X | X | X | | X | | X | | | | | | X | X | | | | | | | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 6 | 6 | ![](samples\6\clu6-sample0.png) | ![](samples\6\clu6-sample1.png) | ![](samples\6\clu6-sample2.png) | ![](samples\6\clu6-sample3.png) | ![](samples\6\clu6-sample4.png) | X | | X | | | | X | | | | | | | X | | | | | | | | | | | | X | | X | | | | | | X | X | | | X | | | X | X | | | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
psyche/kowiki
--- language: - ko license: - apache-2.0 dataset_info: features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 1142558231.8083806 num_examples: 531002 - name: validation num_bytes: 126952588.19161937 num_examples: 59001 download_size: 742445023 dataset_size: 1269510820.0 ---
autoevaluate/autoeval-eval-phpthinh__examplei-match-bd10ea-1748761025
--- type: predictions tags: - autotrain - evaluation datasets: - phpthinh/examplei eval_info: task: text_zero_shot_classification model: bigscience/bloom-1b7 metrics: ['f1'] dataset_name: phpthinh/examplei dataset_config: match dataset_split: test col_mapping: text: text classes: classes target: target --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Zero-Shot Text Classification * Model: bigscience/bloom-1b7 * Dataset: phpthinh/examplei * Config: match * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model.
joey234/mmlu-medical_genetics-rule-neg-prepend
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: neg_prompt dtype: string splits: - name: test num_bytes: 43760 num_examples: 100 download_size: 30190 dataset_size: 43760 --- # Dataset Card for "mmlu-medical_genetics-rule-neg-prepend" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
rcds/swiss_judgment_prediction
--- pretty_name: Swiss-Judgment-Prediction annotations_creators: - found language_creators: - found language: - de - fr - it - en license: - cc-by-sa-4.0 multilinguality: - multilingual size_categories: - 10K<n<100K source_datasets: - original task_categories: - text-classification task_ids: [] tags: - judgement-prediction dataset_info: - config_name: de features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 104270719 num_examples: 35458 - name: validation num_bytes: 12131878 num_examples: 4705 - name: test num_bytes: 26056177 num_examples: 9725 download_size: 1000382331 dataset_size: 142458774 - config_name: fr features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 96807957 num_examples: 21179 - name: validation num_bytes: 13031904 num_examples: 3095 - name: test num_bytes: 33318359 num_examples: 6820 download_size: 1000382331 dataset_size: 143158220 - config_name: it features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 10773516 num_examples: 3072 - name: validation num_bytes: 1045551 num_examples: 408 - name: test num_bytes: 2474761 num_examples: 812 download_size: 1000382331 dataset_size: 
14293828 - config_name: mt_de features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 106990696 num_examples: 24251 - name: validation - name: test download_size: 1000382331 dataset_size: 106990696 - config_name: mt_fr features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 117932134 num_examples: 38524 - name: validation - name: test download_size: 1000382331 dataset_size: 117932134 - config_name: mt_it features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 201749076 num_examples: 56631 - name: validation - name: test download_size: 1000382331 dataset_size: 201749076 - config_name: mt_en features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 196352783 num_examples: 59703 - name: validation - name: test download_size: 1000382331 dataset_size: 196352783 - config_name: all features: - name: id dtype: int32 - 
name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 211852192 num_examples: 59709 - name: validation num_bytes: 26209333 num_examples: 8208 - name: test num_bytes: 61849297 num_examples: 17357 download_size: 1000382331 dataset_size: 299910822 - config_name: all+mt features: - name: id dtype: int32 - name: year dtype: int32 - name: text dtype: string - name: label dtype: class_label: names: '0': dismissal '1': approval - name: language dtype: string - name: region dtype: string - name: canton dtype: string - name: legal area dtype: string - name: source_language dtype: string splits: - name: train num_bytes: 834876881 num_examples: 238818 - name: validation num_bytes: 26209333 num_examples: 8208 - name: test num_bytes: 61849297 num_examples: 17357 download_size: 1000382331 dataset_size: 922935511 --- # Dataset Card for "SwissJudgmentPrediction" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset 
Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://github.com/JoelNiklaus/SwissCourtRulingCorpus - **Repository:** https://github.com/JoelNiklaus/SwissCourtRulingCorpus - **Paper:** https://arxiv.org/abs/2110.00806 - **Leaderboard:** N/A - **Point of Contact:** [Joel Niklaus](mailto:joel.niklaus@inf.unibe.ch) ### Dataset Summary **Documents** Swiss-Judgment-Prediction is a multilingual, diachronic dataset of 85K Swiss Federal Supreme Court (FSCS) cases annotated with the respective binarized judgment outcome (approval/dismissal), posing a challenging text classification task. We also provide additional metadata, i.e., the publication year, the legal area and the canton of origin per case, to promote robustness and fairness studies on the critical area of legal NLP. ### Supported Tasks and Leaderboards SwissJudgmentPrediction can be used for the legal judgment prediction task. The dataset is not yet part of an established benchmark. ### Languages Switzerland has four official languages, with 3 languages (German, French and Italian) being represented in more than 1000 Swiss Federal Supreme Court decisions. The decisions are written by the judges and clerks in the language of the proceedings. ## Dataset Structure In version 2 we added machine-translated data for all documents into German, French, Italian and English, using [EasyNMT](https://github.com/UKPLab/EasyNMT), as an additional training set. ### Data Instances **Multilingual use of the dataset** When the dataset is used in a multilingual setting, select the 'all_languages' flag: ```python from datasets import load_dataset dataset = load_dataset('swiss_judgment_prediction', 'all_languages') ``` ``` { "id": 48757, "year": 2015, "facts": "Sachverhalt: A. X._ war bei der Krankenversicherung C._ taggeldversichert. 
Infolge einer Arbeitsunf\u00e4higkeit leistete ihm die C._ vom 30. Juni 2011 bis am 28. Juni 2013 Krankentaggelder, wobei die Leistungen bis am 30. September 2012 auf Grundlage einer Arbeitsunf\u00e4higkeit von 100% und danach basierend auf einer Arbeitsunf\u00e4higkeit von 55% erbracht wurden. Die Neueinsch\u00e4tzung der Arbeitsf\u00e4higkeit erfolgte anhand eines Gutachtens der D._ AG vom 27. August 2012, welches im Auftrag der C._ erstellt wurde. X._ machte daraufhin gegen\u00fcber der C._ geltend, er sei entgegen dem Gutachten auch nach dem 30. September 2012 zu 100% arbeitsunf\u00e4hig gewesen. Ferner verlangte er von der D._ AG zwecks externer \u00dcberpr\u00fcfung des Gutachtens die Herausgabe s\u00e4mtlicher diesbez\u00fcglicher Notizen, Auswertungen und Unterlagen. A._ (als Gesch\u00e4ftsf\u00fchrer der D._ AG) und B._ (als f\u00fcr das Gutachten medizinisch Verantwortliche) antworteten ihm, dass sie alle Unterlagen der C._ zugestellt h\u00e4tten und dass allf\u00e4llige Fragen zum Gutachten direkt der C._ zu stellen seien. X._ reichte am 2. Januar 2014 eine Strafanzeige gegen A._ und B._ ein. Er wirft diesen vor, ihn durch die Nichtherausgabe der Dokumente und durch Behinderung des IV-Verfahrens gen\u00f6tigt, Daten besch\u00e4digt bzw. vernichtet und ein falsches \u00e4rztliches Zeugnis ausgestellt zu haben. Zudem h\u00e4tten sie durch die Verz\u00f6gerung des IV-Verfahrens und insbesondere durch das falsche \u00e4rztliche Zeugnis sein Verm\u00f6gen arglistig gesch\u00e4digt. B. Die Staatsanwaltschaft des Kantons Bern, Region Oberland, nahm das Verfahren wegen N\u00f6tigung, Datenbesch\u00e4digung, falschem \u00e4rztlichem Zeugnis und arglistiger Verm\u00f6genssch\u00e4digung mit Verf\u00fcgung vom 10. November 2014 nicht an die Hand. Das Obergericht des Kantons Bern wies die von X._ dagegen erhobene Beschwerde am 27. April 2015 ab, soweit darauf einzutreten war. C. X._ beantragt mit Beschwerde in Strafsachen, der Beschluss vom 27. 
April 2015 sei aufzuheben und die Angelegenheit zur korrekten Ermittlung des Sachverhalts an die Staatsanwaltschaft zur\u00fcckzuweisen. Er stellt zudem den sinngem\u00e4ssen Antrag, das bundesgerichtliche Verfahren sei w\u00e4hrend der Dauer des konnexen Strafverfahrens gegen eine Teilgutachterin und des ebenfalls konnexen Zivil- oder Strafverfahrens gegen die C._ wegen Einsichtsverweigerung in das mutmasslich gef\u00e4lschte Originalgutachten zu sistieren. X._ ersucht um unentgeltliche Rechtspflege. ", "labels": 0, # dismissal "language": "de", "region": "Espace Mittelland", "canton": "be", "legal area": "penal law" } ``` **Monolingual use of the dataset** When the dataset is used in a monolingual setting selecting the ISO language code for one of the 3 supported languages. For example: ```python from datasets import load_dataset dataset = load_dataset('swiss_judgment_prediction', 'de') ``` ``` { "id": 48757, "year": 2015, "facts": "Sachverhalt: A. X._ war bei der Krankenversicherung C._ taggeldversichert. Infolge einer Arbeitsunf\u00e4higkeit leistete ihm die C._ vom 30. Juni 2011 bis am 28. Juni 2013 Krankentaggelder, wobei die Leistungen bis am 30. September 2012 auf Grundlage einer Arbeitsunf\u00e4higkeit von 100% und danach basierend auf einer Arbeitsunf\u00e4higkeit von 55% erbracht wurden. Die Neueinsch\u00e4tzung der Arbeitsf\u00e4higkeit erfolgte anhand eines Gutachtens der D._ AG vom 27. August 2012, welches im Auftrag der C._ erstellt wurde. X._ machte daraufhin gegen\u00fcber der C._ geltend, er sei entgegen dem Gutachten auch nach dem 30. September 2012 zu 100% arbeitsunf\u00e4hig gewesen. Ferner verlangte er von der D._ AG zwecks externer \u00dcberpr\u00fcfung des Gutachtens die Herausgabe s\u00e4mtlicher diesbez\u00fcglicher Notizen, Auswertungen und Unterlagen. 
A._ (als Gesch\u00e4ftsf\u00fchrer der D._ AG) und B._ (als f\u00fcr das Gutachten medizinisch Verantwortliche) antworteten ihm, dass sie alle Unterlagen der C._ zugestellt h\u00e4tten und dass allf\u00e4llige Fragen zum Gutachten direkt der C._ zu stellen seien. X._ reichte am 2. Januar 2014 eine Strafanzeige gegen A._ und B._ ein. Er wirft diesen vor, ihn durch die Nichtherausgabe der Dokumente und durch Behinderung des IV-Verfahrens gen\u00f6tigt, Daten besch\u00e4digt bzw. vernichtet und ein falsches \u00e4rztliches Zeugnis ausgestellt zu haben. Zudem h\u00e4tten sie durch die Verz\u00f6gerung des IV-Verfahrens und insbesondere durch das falsche \u00e4rztliche Zeugnis sein Verm\u00f6gen arglistig gesch\u00e4digt. B. Die Staatsanwaltschaft des Kantons Bern, Region Oberland, nahm das Verfahren wegen N\u00f6tigung, Datenbesch\u00e4digung, falschem \u00e4rztlichem Zeugnis und arglistiger Verm\u00f6genssch\u00e4digung mit Verf\u00fcgung vom 10. November 2014 nicht an die Hand. Das Obergericht des Kantons Bern wies die von X._ dagegen erhobene Beschwerde am 27. April 2015 ab, soweit darauf einzutreten war. C. X._ beantragt mit Beschwerde in Strafsachen, der Beschluss vom 27. April 2015 sei aufzuheben und die Angelegenheit zur korrekten Ermittlung des Sachverhalts an die Staatsanwaltschaft zur\u00fcckzuweisen. Er stellt zudem den sinngem\u00e4ssen Antrag, das bundesgerichtliche Verfahren sei w\u00e4hrend der Dauer des konnexen Strafverfahrens gegen eine Teilgutachterin und des ebenfalls konnexen Zivil- oder Strafverfahrens gegen die C._ wegen Einsichtsverweigerung in das mutmasslich gef\u00e4lschte Originalgutachten zu sistieren. X._ ersucht um unentgeltliche Rechtspflege. 
", "labels": 0, # dismissal "language": "de", "region": "Espace Mittelland", "canton": "be", "legal area": "penal law" } ``` ### Data Fields **Multilingual use of the dataset** The following data fields are provided for documents (`train`, `validation`, `test`): `id`: (**int**) a unique identifier of the for the document \ `year`: (**int**) the publication year \ `text`: (**str**) the facts of the case \ `label`: (**class label**) the judgment outcome: 0 (dismissal) or 1 (approval) \ `language`: (**str**) one of (de, fr, it) \ `region`: (**str**) the region of the lower court \ `canton`: (**str**) the canton of the lower court \ `legal area`: (**str**) the legal area of the case **Monolingual use of the dataset** The following data fields are provided for documents (`train`, `validation`, `test`): `id`: (**int**) a unique identifier of the for the document \ `year`: (**int**) the publication year \ `text`: (**str**) the facts of the case \ `label`: (**class label**) the judgment outcome: 0 (dismissal) or 1 (approval) \ `language`: (**str**) one of (de, fr, it) \ `region`: (**str**) the region of the lower court \ `canton`: (**str**) the canton of the lower court \ `legal area`: (**str**) the legal area of the case ### Data Splits | Language | Subset | Number of Documents (Training/Validation/Test) | |------------|------------|------------------------------------------------| | German | **de** | 35'452 / 4'705 / 9'725 | | French | **fr** | 21'179 / 3'095 / 6'820 | | Italian | **it** | 3'072 / 408 / 812 | | All | **all** | 59'709 / 8'208 / 17'357 | | MT German | **mt_de** | 24'251 / 0 / 0 | | MT French | **mt_fr** | 38'524 / 0 / 0 | | MT Italian | **mt_it** | 56'631 / 0 / 0 | | MT All | **all+mt** | 238'818 / 8'208 / 17'357 | ## Dataset Creation ### Curation Rationale The dataset was curated by Niklaus et al. (2021). 
### Source Data #### Initial Data Collection and Normalization The original data are available at the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML. #### Who are the source language producers? Switzerland has four official languages, with 3 languages (German, French and Italian) being represented in more than 1000 Swiss Federal Supreme Court decisions. The decisions are written by the judges and clerks in the language of the proceedings. ### Annotations #### Annotation process The decisions have been annotated with the binarized judgment outcome using parsers and regular expressions. #### Who are the annotators? Joel Niklaus and Adrian Jörg annotated the binarized judgment outcomes. Metadata is published by the Swiss Federal Supreme Court (https://www.bger.ch). ### Personal and Sensitive Information The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html. ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators Niklaus et al. (2021) ### Licensing Information We release the data under CC-BY-4.0, which complies with the court licensing (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf) © Swiss Federal Supreme Court, 2000-2020 The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. 
This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made. Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf ### Citation Information *Joel Niklaus, Ilias Chalkidis, and Matthias StΓΌrmer.* *Swiss-Judgment-Prediction: A Multilingual Legal Judgment Prediction Benchmark* *Proceedings of the 2021 Natural Legal Language Processing Workshop. Punta Cana, Dominican Republic. 2021* ``` @InProceedings{niklaus-etal-2021-swiss, author = {Niklaus, Joel and Chalkidis, Ilias and StΓΌrmer, Matthias}, title = {Swiss-Judgment-Prediction: A Multilingual Legal Judgment Prediction Benchmark}, booktitle = {Proceedings of the 2021 Natural Legal Language Processing Workshop}, year = {2021}, location = {Punta Cana, Dominican Republic}, } ``` and the new citation ``` @misc{niklaus2022empirical, title={An Empirical Study on Cross-X Transfer for Legal Judgment Prediction}, author={Joel Niklaus and Matthias StΓΌrmer and Ilias Chalkidis}, year={2022}, eprint={2209.12325}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ### Contributions Thanks to [@joelniklaus](https://github.com/joelniklaus) for adding this dataset.
tpremoli/CelebA-attrs-20k
--- dataset_info: features: - name: image dtype: image - name: 5_o_Clock_Shadow dtype: int64 - name: Arched_Eyebrows dtype: int64 - name: Attractive dtype: int64 - name: Bags_Under_Eyes dtype: int64 - name: Bald dtype: int64 - name: Bangs dtype: int64 - name: Big_Lips dtype: int64 - name: Big_Nose dtype: int64 - name: Black_Hair dtype: int64 - name: Blond_Hair dtype: int64 - name: Blurry dtype: int64 - name: Brown_Hair dtype: int64 - name: Bushy_Eyebrows dtype: int64 - name: Chubby dtype: int64 - name: Double_Chin dtype: int64 - name: Eyeglasses dtype: int64 - name: Goatee dtype: int64 - name: Gray_Hair dtype: int64 - name: Heavy_Makeup dtype: int64 - name: High_Cheekbones dtype: int64 - name: Male dtype: int64 - name: Mouth_Slightly_Open dtype: int64 - name: Mustache dtype: int64 - name: Narrow_Eyes dtype: int64 - name: No_Beard dtype: int64 - name: Oval_Face dtype: int64 - name: Pale_Skin dtype: int64 - name: Pointy_Nose dtype: int64 - name: Receding_Hairline dtype: int64 - name: Rosy_Cheeks dtype: int64 - name: Sideburns dtype: int64 - name: Smiling dtype: int64 - name: Straight_Hair dtype: int64 - name: Wavy_Hair dtype: int64 - name: Wearing_Earrings dtype: int64 - name: Wearing_Hat dtype: int64 - name: Wearing_Lipstick dtype: int64 - name: Wearing_Necklace dtype: int64 - name: Wearing_Necktie dtype: int64 - name: Young dtype: int64 - name: prompt_string dtype: string splits: - name: train num_bytes: 148170817.19 num_examples: 19999 - name: validation num_bytes: 18181812.67 num_examples: 2451 - name: test num_bytes: 18265274.4 num_examples: 2440 download_size: 175240272 dataset_size: 184617904.26000002 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* --- # CelebA-128x128 CelebA with attrs at 128x128 resolution. ## Dataset Information The attributes are binary attributes. The dataset is already split into train/test/validation sets. 
This dataset has been reduced to 20k train samples. ## Citation ```bibtex @inproceedings{liu2015faceattributes, title = {Deep Learning Face Attributes in the Wild}, author = {Liu, Ziwei and Luo, Ping and Wang, Xiaogang and Tang, Xiaoou}, booktitle = {Proceedings of International Conference on Computer Vision (ICCV)}, month = {December}, year = {2015} } ```
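The card lists a `prompt_string` column alongside the binary attribute columns; as a rough illustration of how such a caption can be derived from attribute flags (this is not the repo's actual generation code, and the +1/-1 encoding is an assumption carried over from the original CelebA):

```python
# Toy attribute flags; original CelebA uses +1/-1 (this repackaging may use 0/1 —
# either way, +1 marks the attribute as present).
attrs = {"Smiling": 1, "Male": -1, "Blond_Hair": 1, "Eyeglasses": -1, "Young": 1}

def attrs_to_prompt(attrs):
    # Keep only attributes flagged as present and turn names into readable tokens.
    present = [name.replace("_", " ").lower() for name, v in attrs.items() if v == 1]
    return "a photo of a person: " + ", ".join(present)

print(attrs_to_prompt(attrs))  # a photo of a person: smiling, blond hair, young
```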
CyberHarem/satou_masuki_bangdreamdai2ki
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of satou_masuki/佐藤ますき (BanG Dream! Dai 2-ki) This is the dataset of satou_masuki/佐藤ますき (BanG Dream! Dai 2-ki), containing 63 images and their tags. The core tags of this character are `blonde_hair, short_hair, yellow_eyes, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 63 | 59.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdreamdai2ki/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 63 | 45.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdreamdai2ki/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 126 | 83.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdreamdai2ki/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 63 | 56.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdreamdai2ki/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 126 | 103.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satou_masuki_bangdreamdai2ki/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/satou_masuki_bangdreamdai2ki', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, crop_top, solo, midriff, looking_at_viewer, holding, navel, shirt, fingerless_gloves, earrings, open_jacket, simple_background, white_background, black_gloves, black_skirt, breasts, drumsticks, full_body, long_sleeves, red_jacket, smile | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, solo, jacket, shirt, simple_background, upper_body, white_background, blush | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | crop_top | solo | midriff | looking_at_viewer | holding | navel | shirt | fingerless_gloves | earrings | open_jacket | simple_background | white_background | black_gloves | black_skirt | breasts | drumsticks | full_body | long_sleeves | red_jacket | smile | jacket | upper_body | blush | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:----------|:--------------------|:----------|:--------|:--------|:--------------------|:-----------|:--------------|:--------------------|:-------------------|:---------------|:--------------|:----------|:-------------|:------------|:---------------|:-------------|:--------|:---------|:-------------|:--------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | | | X | | | | X | X | | | | | | | | | X | X | X |
huggingface-legal/takedown-notices
--- license: cc-by-nc-nd-4.0 tags: - legal --- ### Takedown notices received by the Hugging Face team Please click on Files and versions to browse them Also check out our: - [Terms of Service](https://huggingface.co/terms-of-service) - [Community Code of Conduct](https://huggingface.co/code-of-conduct) - [Content Guidelines](https://huggingface.co/content-guidelines)
cakiki/token-graph
--- license: apache-2.0 ---
cj-mills/pexels-110k-768p-min-jpg-depth-anything-large-hf
--- license: cc0-1.0 --- This dataset contains depth maps for all images in the Pexels 110k 768p JPEG dataset. - **Source Dataset:** [Pexels 110k 768p JPEG](https://www.kaggle.com/datasets/innominate817/pexels-110k-768p-min-jpg?select=pexels-110k-768p-min-jpg) ### Sample Image with Depth Map ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6240f5429d7bad9474e7ef39/Jx7ks9dCaK2kkBigmsX6y.png) ## Metadata ### Authors | Author Name | Bio | | --------------- | ---- | | Christian Mills | | ### Coverage | Temporal Coverage Start Date | Temporal Coverage End Date | Geospatial Coverage | | ---------------------------- | -------------------------- | ------------------- | | 08/11/2013 | 11/06/2019 | Worldwide | ### Provenance | Sources | Collection Methodology | | ------------------------------------------------------------ | ------------------------------------------------------------ | | https://www.kaggle.com/datasets/innominate817/pexels-110k-768p-min-jpg?select=pexels-110k-768p-min-jpg | I used the Depth Anything Large model to generate the depth maps. | ### Expected Update Frequency * Never
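As a rough sketch of the post-processing such depth maps involve (illustrative only, not necessarily the exact pipeline used for this dataset), a raw model prediction with an arbitrary value range can be min-max normalized to 8-bit grayscale before saving as an image:

```python
def depth_to_uint8(rows):
    # Min-max normalize a 2D grid of raw depth values (list of lists)
    # to integers in 0..255, suitable for saving as a grayscale image.
    flat = [v for row in rows for v in row]
    lo, hi = min(flat), max(flat)
    scale = max(hi - lo, 1e-8)  # guard against a constant-depth map
    return [[int((v - lo) / scale * 255) for v in row] for row in rows]

normalized = depth_to_uint8([[0.2, 1.5], [3.0, 0.2]])
print(normalized)  # [[0, 118], [255, 0]]
```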
Mandala1/webelements
--- dataset_info: features: - name: image dtype: image splits: - name: test num_bytes: 24406.0 num_examples: 1 download_size: 21437 dataset_size: 24406.0 configs: - config_name: default data_files: - split: test path: data/test-* ---
wantoo12345/1
--- license: bigcode-openrail-m ---
RadicalRendy/jakdataset
--- license: openrail ---
marup/ConorMcGregorRVC200Epochs
--- license: openrail ---
open-llm-leaderboard/details_BEE-spoke-data__NanoLlama-GQA-L10-A32_KV8-v13-KI
--- pretty_name: Evaluation run of BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI](https://huggingface.co/BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 1 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__NanoLlama-GQA-L10-A32_KV8-v13-KI\"\ ,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\ \ are the [latest results from run 2023-12-02T14:06:00.673559](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__NanoLlama-GQA-L10-A32_KV8-v13-KI/blob/main/results_2023-12-02T14-06-00.673559.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.009097801364670205,\n\ \ \"acc_stderr\": 0.002615326510775673\n },\n \"harness|gsm8k|5\":\ \ {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.002615326510775673\n\ \ }\n}\n```" repo_url: https://huggingface.co/BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_gsm8k_5 data_files: - split: 2023_12_02T14_06_00.673559 path: - '**/details_harness|gsm8k|5_2023-12-02T14-06-00.673559.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-02T14-06-00.673559.parquet' - config_name: results data_files: - split: 2023_12_02T14_06_00.673559 path: - results_2023-12-02T14-06-00.673559.parquet - split: latest path: - results_2023-12-02T14-06-00.673559.parquet --- # Dataset Card for Evaluation run of BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI](https://huggingface.co/BEE-spoke-data/NanoLlama-GQA-L10-A32_KV8-v13-KI) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. 
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__NanoLlama-GQA-L10-A32_KV8-v13-KI", "harness_gsm8k_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-02T14:06:00.673559](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__NanoLlama-GQA-L10-A32_KV8-v13-KI/blob/main/results_2023-12-02T14-06-00.673559.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.009097801364670205, "acc_stderr": 0.002615326510775673 }, "harness|gsm8k|5": { "acc": 0.009097801364670205, "acc_stderr": 0.002615326510775673 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni
--- pretty_name: Evaluation run of Weyaxi/ChatAYT-Lora-Assamble-Marcoroni dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Weyaxi/ChatAYT-Lora-Assamble-Marcoroni](https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T13:33:47.797770](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni/blob/main/results_2023-10-24T13-33-47.797770.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008598993288590604,\n\ \ \"em_stderr\": 0.0009455579144542189,\n \"f1\": 0.1045532718120813,\n\ \ \"f1_stderr\": 0.0020198084132137728,\n \"acc\": 0.43109211314448,\n\ \ \"acc_stderr\": 0.009797803895878525\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.008598993288590604,\n \"em_stderr\": 0.0009455579144542189,\n\ \ \"f1\": 0.1045532718120813,\n \"f1_stderr\": 0.0020198084132137728\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \ \ \"acc_stderr\": 0.007831458737058703\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698346\n\ \ }\n}\n```" repo_url: https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|arc:challenge|25_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-14T08-39-51.722063.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T13_33_47.797770 path: - '**/details_harness|drop|3_2023-10-24T13-33-47.797770.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T13-33-47.797770.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T13_33_47.797770 path: - '**/details_harness|gsm8k|5_2023-10-24T13-33-47.797770.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T13-33-47.797770.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hellaswag|10_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T08-39-51.722063.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_14T08_39_51.722063 path: - '**/details_harness|truthfulqa:mc|0_2023-09-14T08-39-51.722063.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-14T08-39-51.722063.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T13_33_47.797770 path: - '**/details_harness|winogrande|5_2023-10-24T13-33-47.797770.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T13-33-47.797770.parquet' - config_name: results data_files: - split: 2023_09_14T08_39_51.722063 path: - results_2023-09-14T08-39-51.722063.parquet - split: 2023_10_24T13_33_47.797770 path: - results_2023-10-24T13-33-47.797770.parquet - split: latest path: - results_2023-10-24T13-33-47.797770.parquet --- # Dataset Card for Evaluation run of Weyaxi/ChatAYT-Lora-Assamble-Marcoroni ## Dataset 
Description

- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Weyaxi/ChatAYT-Lora-Assamble-Marcoroni](https://huggingface.co/Weyaxi/ChatAYT-Lora-Assamble-Marcoroni) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T13:33:47.797770](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__ChatAYT-Lora-Assamble-Marcoroni/blob/main/results_2023-10-24T13-33-47.797770.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.008598993288590604,
        "em_stderr": 0.0009455579144542189,
        "f1": 0.1045532718120813,
        "f1_stderr": 0.0020198084132137728,
        "acc": 0.43109211314448,
        "acc_stderr": 0.009797803895878525
    },
    "harness|drop|3": {
        "em": 0.008598993288590604,
        "em_stderr": 0.0009455579144542189,
        "f1": 0.1045532718120813,
        "f1_stderr": 0.0020198084132137728
    },
    "harness|gsm8k|5": {
        "acc": 0.0887035633055345,
        "acc_stderr": 0.007831458737058703
    },
    "harness|winogrande|5": {
        "acc": 0.7734806629834254,
        "acc_stderr": 0.011764149054698346
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
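Since the "Latest results" block in the card above is plain JSON, the per-task metrics can be post-processed directly once loaded. A minimal sketch (the variable names `latest`, `accs`, and `best_task` are illustrative, not part of the card; only a subset of the fields shown above is used):

```python
import json

# A subset of the aggregated metrics shown under "Latest results",
# copied from the card as a JSON string.
latest = json.loads("""
{
    "harness|drop|3": {"em": 0.008598993288590604, "f1": 0.1045532718120813},
    "harness|gsm8k|5": {"acc": 0.0887035633055345},
    "harness|winogrande|5": {"acc": 0.7734806629834254}
}
""")

# Collect the accuracy metric for each task that reports one
# (drop reports em/f1 instead, so it is skipped here).
accs = {task: m["acc"] for task, m in latest.items() if "acc" in m}
best_task = max(accs, key=accs.get)
print(best_task)  # winogrande has the highest accuracy of the tasks above
```

The same pattern applies to the full results file linked in the card; the keys simply cover all evaluated tasks.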
open-llm-leaderboard/details_Dans-DiscountModels__ShearedLlama-1.3b-FFT-Test1
--- pretty_name: Evaluation run of Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1](https://huggingface.co/Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dans-DiscountModels__ShearedLlama-1.3b-FFT-Test1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-16T16:48:32.106245](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__ShearedLlama-1.3b-FFT-Test1/blob/main/results_2023-12-16T16-48-32.106245.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26194360492974045,\n\ \ \"acc_stderr\": 0.031003587918478445,\n \"acc_norm\": 0.26393590768678044,\n\ \ \"acc_norm_stderr\": 0.0318032795070849,\n \"mc1\": 0.22766217870257038,\n\ \ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3696758746746233,\n\ \ \"mc2_stderr\": 0.013710142031833798\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.2935153583617747,\n \"acc_stderr\": 0.013307250444941122,\n\ \ \"acc_norm\": 0.3267918088737201,\n \"acc_norm_stderr\": 0.013706665975587336\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4500099581756622,\n\ \ \"acc_stderr\": 0.004964779805180661,\n \"acc_norm\": 0.5998805018920533,\n\ \ \"acc_norm_stderr\": 0.0048892106289079775\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\ \ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\ \ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.03110318238312338,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.03110318238312338\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\ \ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \ \ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\ \ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\ \ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\ \ \"acc_norm_stderr\": 
0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\ \ \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n\ \ \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\ \ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745647,\n\ \ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745647\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\ \ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\ \ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\ \ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\ acc_norm\": 
0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n\ \ \"acc_stderr\": 0.03129843185743809,\n \"acc_norm\": 0.14285714285714285,\n\ \ \"acc_norm_stderr\": 0.03129843185743809\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\ \ \"acc_stderr\": 0.02499305339776482,\n \"acc_norm\": 0.26129032258064516,\n\ \ \"acc_norm_stderr\": 0.02499305339776482\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.031947400722655415,\n\ \ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.031947400722655415\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\ : 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\ \ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\ acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775295,\n\ \ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775295\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222717,\n\ \ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222717\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073845,\n \ \ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073845\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341937,\n\ \ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341937\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\ acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"\ acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258536,\n \"\ acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258536\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\ acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.25738396624472576,\n \"acc_stderr\": 0.0284588209914603,\n \ \ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.0284588209914603\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\ \ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.33183856502242154,\n\ \ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\ \ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083499,\n \"\ acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083499\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\ \ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\ \ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\ \ \"acc_stderr\": 0.0443280405529152,\n \"acc_norm\": 0.32142857142857145,\n\ \ \"acc_norm_stderr\": 0.0443280405529152\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\ \ \"acc_stderr\": 0.0302363899421731,\n \"acc_norm\": 0.3076923076923077,\n\ \ \"acc_norm_stderr\": 0.0302363899421731\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27586206896551724,\n\ \ \"acc_stderr\": 0.015982814774695625,\n \"acc_norm\": 0.27586206896551724,\n\ \ \"acc_norm_stderr\": 0.015982814774695625\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\ \ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\ \ \"acc_stderr\": 0.014355911964767864,\n 
\"acc_norm\": 0.2435754189944134,\n\ \ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\ \ \"acc_stderr\": 0.025218040373410626,\n \"acc_norm\": 0.27009646302250806,\n\ \ \"acc_norm_stderr\": 0.025218040373410626\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n\ \ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469414,\n \ \ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469414\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\ \ \"acc_stderr\": 0.010956556654417346,\n \"acc_norm\": 0.24315514993481094,\n\ \ \"acc_norm_stderr\": 0.010956556654417346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\ \ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.27124183006535946,\n \"acc_stderr\": 0.017986615304030305,\n \ \ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.017986615304030305\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.025000256039546212,\n\ \ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 
0.025000256039546212\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\ \ \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.26865671641791045,\n\ \ \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\ \ \"acc_stderr\": 0.036471685236832266,\n \"acc_norm\": 0.3253012048192771,\n\ \ \"acc_norm_stderr\": 0.036471685236832266\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691583,\n\ \ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691583\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\ \ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3696758746746233,\n\ \ \"mc2_stderr\": 0.013710142031833798\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5872138910812944,\n \"acc_stderr\": 0.0138370606486821\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \ \ \"acc_stderr\": 0.0013121578148674168\n }\n}\n```" repo_url: https://huggingface.co/Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|arc:challenge|25_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-16T16-48-32.106245.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|gsm8k|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hellaswag_10 
data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hellaswag|10_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-48-32.106245.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-48-32.106245.parquet' - 
'**/details_harness|hendrycksTest-management|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-48-32.106245.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-48-32.106245.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-48-32.106245.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-48-32.106245.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-48-32.106245.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-48-32.106245.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-management|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-48-32.106245.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-48-32.106245.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|truthfulqa:mc|0_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-16T16-48-32.106245.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_16T16_48_32.106245 path: - '**/details_harness|winogrande|5_2023-12-16T16-48-32.106245.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-16T16-48-32.106245.parquet' - config_name: results data_files: - split: 
2023_12_16T16_48_32.106245 path: - results_2023-12-16T16-48-32.106245.parquet - split: latest path: - results_2023-12-16T16-48-32.106245.parquet
---

# Dataset Card for Evaluation run of Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1](https://huggingface.co/Dans-DiscountModels/ShearedLlama-1.3b-FFT-Test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dans-DiscountModels__ShearedLlama-1.3b-FFT-Test1",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-12-16T16:48:32.106245](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__ShearedLlama-1.3b-FFT-Test1/blob/main/results_2023-12-16T16-48-32.106245.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26194360492974045, "acc_stderr": 0.031003587918478445, "acc_norm": 0.26393590768678044, "acc_norm_stderr": 0.0318032795070849, "mc1": 0.22766217870257038, "mc1_stderr": 0.014679255032111075, "mc2": 0.3696758746746233, "mc2_stderr": 0.013710142031833798 }, "harness|arc:challenge|25": { "acc": 0.2935153583617747, "acc_stderr": 0.013307250444941122, "acc_norm": 0.3267918088737201, "acc_norm_stderr": 0.013706665975587336 }, "harness|hellaswag|10": { "acc": 0.4500099581756622, "acc_stderr": 0.004964779805180661, "acc_norm": 0.5998805018920533, "acc_norm_stderr": 0.0048892106289079775 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3037037037037037, "acc_stderr": 0.03972552884785137, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.03972552884785137 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.03110318238312338, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.03110318238312338 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2679245283018868, "acc_stderr": 0.027257260322494845, "acc_norm": 0.2679245283018868, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2708333333333333, "acc_stderr": 0.03716177437566016, "acc_norm": 0.2708333333333333, "acc_norm_stderr": 0.03716177437566016 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.040201512610368445, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, 
"acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.030631145539198823, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.030631145539198823 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.03873958714149351, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.03873958714149351 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.33191489361702126, "acc_stderr": 0.030783736757745647, "acc_norm": 0.33191489361702126, "acc_norm_stderr": 0.030783736757745647 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436716, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436716 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2206896551724138, "acc_stderr": 0.03455930201924811, "acc_norm": 0.2206896551724138, "acc_norm_stderr": 0.03455930201924811 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.0220190800122179, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.0220190800122179 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.14285714285714285, "acc_stderr": 0.03129843185743809, "acc_norm": 0.14285714285714285, "acc_norm_stderr": 0.03129843185743809 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.26129032258064516, "acc_stderr": 0.02499305339776482, "acc_norm": 0.26129032258064516, "acc_norm_stderr": 0.02499305339776482 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.29064039408866993, "acc_stderr": 0.031947400722655415, "acc_norm": 0.29064039408866993, "acc_norm_stderr": 0.031947400722655415 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23030303030303031, "acc_stderr": 0.03287666758603489, "acc_norm": 0.23030303030303031, "acc_norm_stderr": 0.03287666758603489 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2222222222222222, "acc_stderr": 0.02962022787479047, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.02962022787479047 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.21761658031088082, "acc_stderr": 0.02977866303775295, "acc_norm": 0.21761658031088082, "acc_norm_stderr": 0.02977866303775295 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23076923076923078, "acc_stderr": 0.021362027725222717, "acc_norm": 0.23076923076923078, "acc_norm_stderr": 0.021362027725222717 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073845, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073845 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.028510251512341937, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.028510251512341937 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, "acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.22568807339449543, "acc_stderr": 0.01792308766780306, "acc_norm": 0.22568807339449543, "acc_norm_stderr": 0.01792308766780306 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.17592592592592593, "acc_stderr": 
0.025967420958258536, "acc_norm": 0.17592592592592593, "acc_norm_stderr": 0.025967420958258536 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24019607843137256, "acc_stderr": 0.02998373305591361, "acc_norm": 0.24019607843137256, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25738396624472576, "acc_stderr": 0.0284588209914603, "acc_norm": 0.25738396624472576, "acc_norm_stderr": 0.0284588209914603 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.33183856502242154, "acc_stderr": 0.031602951437766785, "acc_norm": 0.33183856502242154, "acc_norm_stderr": 0.031602951437766785 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2975206611570248, "acc_stderr": 0.04173349148083499, "acc_norm": 0.2975206611570248, "acc_norm_stderr": 0.04173349148083499 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2777777777777778, "acc_stderr": 0.043300437496507437, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.043300437496507437 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22699386503067484, "acc_stderr": 0.03291099578615769, "acc_norm": 0.22699386503067484, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.0443280405529152, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.0443280405529152 }, "harness|hendrycksTest-management|5": { "acc": 0.22330097087378642, "acc_stderr": 0.04123553189891431, "acc_norm": 0.22330097087378642, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.3076923076923077, "acc_stderr": 0.0302363899421731, "acc_norm": 0.3076923076923077, "acc_norm_stderr": 0.0302363899421731 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 
0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.27586206896551724, "acc_stderr": 0.015982814774695625, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.015982814774695625 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2435754189944134, "acc_stderr": 0.014355911964767864, "acc_norm": 0.2435754189944134, "acc_norm_stderr": 0.014355911964767864 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.26143790849673204, "acc_stderr": 0.025160998214292456, "acc_norm": 0.26143790849673204, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.27009646302250806, "acc_stderr": 0.025218040373410626, "acc_norm": 0.27009646302250806, "acc_norm_stderr": 0.025218040373410626 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2623456790123457, "acc_stderr": 0.024477222856135114, "acc_norm": 0.2623456790123457, "acc_norm_stderr": 0.024477222856135114 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2978723404255319, "acc_stderr": 0.027281608344469414, "acc_norm": 0.2978723404255319, "acc_norm_stderr": 0.027281608344469414 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24315514993481094, "acc_stderr": 0.010956556654417346, "acc_norm": 0.24315514993481094, "acc_norm_stderr": 0.010956556654417346 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.24632352941176472, "acc_stderr": 0.02617343857052, "acc_norm": 0.24632352941176472, "acc_norm_stderr": 0.02617343857052 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.27124183006535946, "acc_stderr": 0.017986615304030305, "acc_norm": 0.27124183006535946, "acc_norm_stderr": 0.017986615304030305 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3090909090909091, 
"acc_stderr": 0.044262946482000985, "acc_norm": 0.3090909090909091, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.025000256039546212, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.025000256039546212 }, "harness|hendrycksTest-sociology|5": { "acc": 0.26865671641791045, "acc_stderr": 0.03134328358208955, "acc_norm": 0.26865671641791045, "acc_norm_stderr": 0.03134328358208955 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-virology|5": { "acc": 0.3253012048192771, "acc_stderr": 0.036471685236832266, "acc_norm": 0.3253012048192771, "acc_norm_stderr": 0.036471685236832266 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.27485380116959063, "acc_stderr": 0.03424042924691583, "acc_norm": 0.27485380116959063, "acc_norm_stderr": 0.03424042924691583 }, "harness|truthfulqa:mc|0": { "mc1": 0.22766217870257038, "mc1_stderr": 0.014679255032111075, "mc2": 0.3696758746746233, "mc2_stderr": 0.013710142031833798 }, "harness|winogrande|5": { "acc": 0.5872138910812944, "acc_stderr": 0.0138370606486821 }, "harness|gsm8k|5": { "acc": 0.002274450341167551, "acc_stderr": 0.0013121578148674168 } }
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
ZhankuiHe/reddit_movie_raw
--- task_categories: - conversational language: - en tags: - recommendation viewer: false --- # Dataset Card for `Reddit-Movie-raw` ## Dataset Description - **Homepage:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys - **Repository:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys - **Paper:** To appear - **Point of Contact:** zhh004@eng.ucsd.edu ### Dataset Summary This dataset provides the raw text from [Reddit](https://reddit.com) related to movie recommendation conversations. The dataset is extracted from the data dump of [pushshift.io](https://arxiv.org/abs/2001.08435) and is intended for research use only. ### Disclaimer ⚠️ **Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.** ### Folder Structure We explain our data folder as follows: ```bash reddit_movie_raw β”œβ”€β”€ IMDB-database β”‚ β”œβ”€β”€ clean.py # script to obtain clean IMDB movie titles, which can be used for movie name matching if needed. β”‚ β”œβ”€β”€ movie_clean.tsv # results after movie title cleaning β”‚ β”œβ”€β”€ title.basics.tsv # original movie title information from IMDB β”‚ └── title.ratings.tsv # original movie title and rating information from IMDB β”œβ”€β”€ Reddit-Movie-large β”‚ β”œβ”€β”€ sentences.jsonl # raw sentences from the subreddit/* data, which can be used for further processing β”‚ └── subreddit # raw text from different subreddits from Jan. 2012 to Dec. 2022 (large) β”‚ β”œβ”€β”€ bestofnetflix.jsonl β”‚ β”œβ”€β”€ movies.jsonl β”‚ β”œβ”€β”€ moviesuggestions.jsonl β”‚ β”œβ”€β”€ netflixbestof.jsonl β”‚ └── truefilm.jsonl └── Reddit-Movie-small β”œβ”€β”€ sentences.jsonl # raw sentences from the subreddit/* data, which can be used for further processing └── subreddit # raw text from different subreddits from Jan. 2022 to Dec. 
2022 (small) β”œβ”€β”€ bestofnetflix.jsonl β”œβ”€β”€ movies.jsonl β”œβ”€β”€ moviesuggestions.jsonl β”œβ”€β”€ netflixbestof.jsonl └── truefilm.jsonl ``` ### Data Processing We also provide first-version processed Reddit-Movie datasets as [Reddit-Movie-small-V1]() and [Reddit-Movie-large-V1](). Join us if you want to improve the processing quality as well! ### Citation Information Please cite these two papers if you use this raw data, thanks! ```bib @inproceedings{baumgartner2020pushshift, title={The pushshift reddit dataset}, author={Baumgartner, Jason and Zannettou, Savvas and Keegan, Brian and Squire, Megan and Blackburn, Jeremy}, booktitle={Proceedings of the international AAAI conference on web and social media}, volume={14}, pages={830--839}, year={2020} } ``` ```bib @inproceedings{he23large, title = "Large language models as zero-shot conversational recommenders", author = "Zhankui He and Zhouhang Xie and Rahul Jha and Harald Steck and Dawen Liang and Yesu Feng and Bodhisattwa Majumder and Nathan Kallus and Julian McAuley", year = "2023", booktitle = "CIKM" } ``` Please contact [Zhankui He](https://aaronheee.github.io) if you have any questions or suggestions.
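The `sentences.jsonl` files above follow the JSON Lines convention (one JSON object per line). A minimal reader sketch — the file path in the usage comment and the shape of each record are assumptions, since the exact schema is not documented here:

```python
import json

def read_jsonl(path):
    """Yield one parsed record per non-empty line of a JSON Lines file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                yield json.loads(line)

# Hypothetical usage once the archive is extracted locally:
# for record in read_jsonl("Reddit-Movie-small/sentences.jsonl"):
#     process(record)
```

Streaming line by line like this avoids loading the full decade of Reddit-Movie-large into memory at once.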
leoleoasd/learningq_ted_ed
--- license: unknown configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: context dtype: string - name: questionsrc dtype: string - name: question dtype: string splits: - name: train num_bytes: 29551239.588505447 num_examples: 6172 - name: validation num_bytes: 3284535.0547172287 num_examples: 686 - name: test num_bytes: 3653207.3567773257 num_examples: 763 download_size: 20939717 dataset_size: 36488982.0 ---
Anwaarma/MySentimentAnwarBig
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: text dtype: string - name: label dtype: class_label: names: '0': Negative '1': Positive splits: - name: train num_bytes: 3402656.0 num_examples: 14666 - name: test num_bytes: 271618.95179553545 num_examples: 1080 download_size: 1986961 dataset_size: 3674274.9517955356 --- # Dataset Card for "MySentimentAnwarBig" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HamdanXI/arb-eng-parallel-10k-splitted-euclidean-95
--- dataset_info: features: - name: arabic dtype: string - name: english dtype: string splits: - name: train num_bytes: 2506374 num_examples: 5227 - name: validation num_bytes: 407437 num_examples: 1000 - name: test num_bytes: 419389 num_examples: 1000 download_size: 1889682 dataset_size: 3333200 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
Minata/ast_method2test_v1
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: test num_bytes: 26340016 num_examples: 15716 download_size: 1879160 dataset_size: 26340016 configs: - config_name: default data_files: - split: test path: data/test-* ---
liuweihug/da
--- license: openrail ---
Nelis/prompts
--- license: unknown dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 19012346 num_examples: 73718 - name: test num_bytes: 3335114 num_examples: 13598 download_size: 10889277 dataset_size: 22347460 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
skrishna/gsm8k_only_answer
--- license: mit --- The data is exactly like the original GSM8k (https://huggingface.co/datasets/gsm8k), but with the label consisting of the correct answer (one number) only.
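In the original GSM8K data, each `answer` field ends its reasoning with a `#### <number>` marker. A label like the one described here could be derived with a small helper — a sketch of that convention, not the exact script used to build this dataset:

```python
def extract_final_answer(answer: str) -> str:
    """Return the final numeric answer from a GSM8K-style solution string,
    where the reasoning ends with a '#### <number>' marker."""
    final = answer.split("####")[-1].strip()
    # GSM8K final answers may contain thousands separators, e.g. "1,200".
    return final.replace(",", "")

# Illustrative (not taken from the dataset):
# extract_final_answer("48/2 = 24 clips in May. 48 + 24 = 72. #### 72")
```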
stamimi/a
--- license: agpl-3.0 ---
danineld/tryout
--- license: bigscience-openrail-m ---
arthurmluz/GPTextSum_data-temario_results
--- dataset_info: features: - name: id dtype: int64 - name: text dtype: string - name: summary dtype: string - name: gen_summary dtype: string - name: rouge struct: - name: rouge1 dtype: float64 - name: rouge2 dtype: float64 - name: rougeL dtype: float64 - name: rougeLsum dtype: float64 - name: bert struct: - name: f1 sequence: float64 - name: hashcode dtype: string - name: precision sequence: float64 - name: recall sequence: float64 - name: moverScore dtype: float64 splits: - name: validation num_bytes: 38584 num_examples: 20 download_size: 47319 dataset_size: 38584 configs: - config_name: default data_files: - split: validation path: data/validation-* --- # Dataset Card for "GPTextSum_data-temario_results" rouge= {'rouge1': 0.3521895422836724, 'rouge2': 0.18278167550878366, 'rougeL': 0.27857021634712387, 'rougeLsum': 0.27857021634712387} bert= {'precision': 0.700176528096199, 'recall': 0.8076501220464707, 'f1': 0.7497184872627258}
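The `rougeL` values reported above are ROUGE-L F-measures. As an illustrative sketch — not the scorer used for these numbers, since official implementations add stemming and other normalization — ROUGE-L can be computed from the longest common subsequence (LCS) of the two token sequences:

```python
def rouge_l_f1(reference: str, candidate: str) -> float:
    """Token-level ROUGE-L F1 via longest-common-subsequence length."""
    ref, cand = reference.split(), candidate.split()
    # Classic dynamic-programming LCS table.
    dp = [[0] * (len(cand) + 1) for _ in range(len(ref) + 1)]
    for i, r in enumerate(ref, 1):
        for j, c in enumerate(cand, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if r == c else max(dp[i - 1][j], dp[i][j - 1])
    lcs = dp[len(ref)][len(cand)]
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)
    recall = lcs / len(ref)
    return 2 * precision * recall / (precision + recall)
```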
open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2
--- pretty_name: Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-01T23:34:01.291770](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2/blob/main/results_2024-03-01T23-34-01.291770.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653998983984977,\n\ \ \"acc_stderr\": 0.03205686303223958,\n \"acc_norm\": 0.6534966993814849,\n\ \ \"acc_norm_stderr\": 0.03272662727558437,\n \"mc1\": 0.627906976744186,\n\ \ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7746116985918058,\n\ \ \"mc2_stderr\": 0.013798217074345885\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n\ \ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7151961760605458,\n\ \ \"acc_stderr\": 0.004503985839041969,\n \"acc_norm\": 0.8906592312288388,\n\ \ \"acc_norm_stderr\": 0.0031142850772280357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\ : 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\ \ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\ acc_norm\": 0.41534391534391535,\n 
\"acc_norm_stderr\": 0.025379524910778394\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"\ acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\ acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\ \ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\ acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\ acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\ acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\ \ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\ \ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\ \ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\ \ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\ \ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\ \ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\ \ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 
0.4424581005586592,\n\ \ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\ \ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\ \ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\ \ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\ \ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\ \ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\ \ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \ \ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \ \ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\ \ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\ \ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\ \ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\ \ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\ \ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7746116985918058,\n\ \ \"mc2_stderr\": 0.013798217074345885\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272956\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \ \ \"acc_stderr\": 0.012679297549515427\n }\n}\n```" repo_url: https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|arc:challenge|25_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-01T23-34-01.291770.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|gsm8k|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hellaswag_10 data_files: - split: 
2024_03_01T23_34_01.291770 path: - '**/details_harness|hellaswag|10_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T23-34-01.291770.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T23-34-01.291770.parquet' - 
'**/details_harness|hendrycksTest-management|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T23-34-01.291770.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T23-34-01.291770.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T23-34-01.291770.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T23-34-01.291770.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T23-34-01.291770.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T23-34-01.291770.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T23-34-01.291770.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T23-34-01.291770.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T23-34-01.291770.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_01T23_34_01.291770 path: - '**/details_harness|winogrande|5_2024-03-01T23-34-01.291770.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-01T23-34-01.291770.parquet' - config_name: results data_files: - split: 
2024_03_01T23_34_01.291770
    path:
    - results_2024-03-01T23-34-01.291770.parquet
  - split: latest
    path:
    - results_2024-03-01T23-34-01.291770.parquet
---

# Dataset Card for Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-03-01T23:34:01.291770](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2/blob/main/results_2024-03-01T23-34-01.291770.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.653998983984977, "acc_stderr": 0.03205686303223958, "acc_norm": 0.6534966993814849, "acc_norm_stderr": 0.03272662727558437, "mc1": 0.627906976744186, "mc1_stderr": 0.01692109011881403, "mc2": 0.7746116985918058, "mc2_stderr": 0.013798217074345885 },
    "harness|arc:challenge|25": { "acc": 0.7022184300341296, "acc_stderr": 0.01336308010724448, "acc_norm": 0.7312286689419796, "acc_norm_stderr": 0.012955065963710695 },
    "harness|hellaswag|10": { "acc": 0.7151961760605458, "acc_stderr": 0.004503985839041969, "acc_norm": 0.8906592312288388, "acc_norm_stderr": 0.0031142850772280357 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778394, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778394 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726855, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726855 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768763, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768763 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297793, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297793 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.039955240076816806, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.039955240076816806 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.01577623925616323, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.01577623925616323 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 },
    "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993464, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993464 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4424581005586592, "acc_stderr": 0.016611393687268584, "acc_norm": 0.4424581005586592, "acc_norm_stderr": 0.016611393687268584 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818767, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818767 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5070921985815603, "acc_stderr": 0.02982449855912901, "acc_norm": 0.5070921985815603, "acc_norm_stderr": 0.02982449855912901 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.4726205997392438, "acc_stderr": 0.012751075788015055, "acc_norm": 0.4726205997392438, "acc_norm_stderr": 0.012751075788015055 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.8538011695906432, "acc_stderr": 0.027097290118070806, "acc_norm": 0.8538011695906432, "acc_norm_stderr": 0.027097290118070806 },
    "harness|truthfulqa:mc|0": { "mc1": 0.627906976744186, "mc1_stderr": 0.01692109011881403, "mc2": 0.7746116985918058, "mc2_stderr": 0.013798217074345885 },
    "harness|winogrande|5": { "acc": 0.8468823993685872, "acc_stderr": 0.010120623252272956 },
    "harness|gsm8k|5": { "acc": 0.6952236542835482, "acc_stderr": 0.012679297549515427 }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used.
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
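The `acc`/`acc_stderr` pairs in the results JSON earlier in this card are consistent with the ordinary binomial standard error of a proportion, computed with the sample-variance convention (dividing by n - 1). A minimal sketch; the per-task sample size (1267 for winogrande's validation split) and the `binomial_stderr` helper name are assumptions, not part of this card:

```python
import math

def binomial_stderr(p: float, n: int) -> float:
    """Standard error of a proportion p measured over n trials,
    using the sample-variance convention (n - 1)."""
    return math.sqrt(p * (1 - p) / (n - 1))

# Winogrande is evaluated on its 1267-example validation split
# (the sample size is an assumption about the harness setup).
stderr = binomial_stderr(0.8468823993685872, 1267)
print(round(stderr, 9))  # ~0.010120623, matching the reported acc_stderr
```

The same formula reproduces the other per-task `acc_stderr` values, e.g. gsm8k with its 1319 test examples.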
adalib/monkey-sub-cond-gen
--- dataset_info: features: - name: code dtype: string splits: - name: train num_bytes: 7013607 num_examples: 650 download_size: 2540020 dataset_size: 7013607 configs: - config_name: default data_files: - split: train path: data/train-* ---
yardeny/processed_t5_small_context_len_64
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 9745169624.0 num_examples: 29710883 download_size: 3781295100 dataset_size: 9745169624.0 --- # Dataset Card for "processed_t5_small_context_len_64" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
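The card above describes examples tokenized to a fixed 64-token context, with each `input_ids` sequence (int32) paired with an `attention_mask` (int8). A minimal sketch of how such fixed-length pairs are typically built by right-padding; the pad id of 0 and the `pad_to_context` helper are assumptions, since the card does not state the tokenizer or its padding convention:

```python
def pad_to_context(ids, context_len=64, pad_id=0):
    """Right-pad a list of token ids to a fixed context length and
    build the matching attention mask (1 = real token, 0 = padding)."""
    if len(ids) > context_len:
        ids = ids[:context_len]                      # truncate overflow
    n_pad = context_len - len(ids)
    input_ids = ids + [pad_id] * n_pad
    attention_mask = [1] * len(ids) + [0] * n_pad
    return input_ids, attention_mask

ids, mask = pad_to_context([101, 2009, 2003, 102])
# both sequences have length 64; the mask marks the 4 real tokens
```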
chiggly007/chirag_test_data
--- license: artistic-2.0 ---
ibunescu/california_tos_court_cases_v1
--- license: cc-by-nc-sa-4.0 ---
open-llm-leaderboard/details_Weyaxi__Einstein-bagel-7B
--- pretty_name: Evaluation run of Weyaxi/Einstein-bagel-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Weyaxi/Einstein-bagel-7B](https://huggingface.co/Weyaxi/Einstein-bagel-7B) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-bagel-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-24T13:17:52.314326](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-bagel-7B/blob/main/results_2024-01-24T13-17-52.314326.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6319266433643548,\n\ \ \"acc_stderr\": 0.032455580300510795,\n \"acc_norm\": 0.6389579812660248,\n\ \ \"acc_norm_stderr\": 0.03312727758951937,\n \"mc1\": 0.4589963280293758,\n\ \ \"mc1_stderr\": 0.017444544447661196,\n \"mc2\": 0.6333126520095955,\n\ \ \"mc2_stderr\": 0.015483397855951944\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.01409099561816848,\n\ \ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.01375206241981783\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6609241187014538,\n\ \ \"acc_stderr\": 0.0047242814878193755,\n \"acc_norm\": 0.8481378211511651,\n\ \ \"acc_norm_stderr\": 0.003581537847581781\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\ \ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\ \ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\ \ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.028901593612411784,\n\ \ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.028901593612411784\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\ \ \"acc_norm_stderr\": 0.03685651095897532\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\ \ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\ \ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\ \ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\ \ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\ \ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"\ acc_norm\": 0.40476190476190477,\n 
\"acc_norm_stderr\": 0.025279850397404907\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\ \ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\ \ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\ : 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945637,\n \"\ acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945637\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\ \ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.028657491285071977,\n\ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.028657491285071977\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\ acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\ acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\ acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\ acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \ \ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n\ \ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\ acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\ \ \"acc_stderr\": 0.013816335389973133,\n \"acc_norm\": 0.8173690932311622,\n\ \ \"acc_norm_stderr\": 0.013816335389973133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\ \ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\ \ \"acc_stderr\": 0.01642881191589886,\n 
\"acc_norm\": 0.40670391061452515,\n\ \ \"acc_norm_stderr\": 0.01642881191589886\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\ \ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\ \ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\ \ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\ \ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \ \ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\ \ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n\ \ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\ \ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862355,\n \ \ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862355\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\ \ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\ \ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n\ \ \"mc1_stderr\": 0.017444544447661196,\n \"mc2\": 0.6333126520095955,\n\ \ \"mc2_stderr\": 0.015483397855951944\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987726\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2812736921910538,\n \ \ \"acc_stderr\": 0.01238478931094024\n }\n}\n```" repo_url: https://huggingface.co/Weyaxi/Einstein-bagel-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|arc:challenge|25_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-24T13-17-52.314326.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|gsm8k|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_24T13_17_52.314326 path: - 
'**/details_harness|hellaswag|10_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-17-52.314326.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-17-52.314326.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-17-52.314326.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-17-52.314326.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-17-52.314326.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-17-52.314326.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-17-52.314326.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-management|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-17-52.314326.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|truthfulqa:mc|0_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-24T13-17-52.314326.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_24T13_17_52.314326 path: - '**/details_harness|winogrande|5_2024-01-24T13-17-52.314326.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-24T13-17-52.314326.parquet' - config_name: results data_files: - split: 
2024_01_24T13_17_52.314326 path: - results_2024-01-24T13-17-52.314326.parquet - split: latest path: - results_2024-01-24T13-17-52.314326.parquet --- # Dataset Card for Evaluation run of Weyaxi/Einstein-bagel-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-bagel-7B](https://huggingface.co/Weyaxi/Einstein-bagel-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-bagel-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-24T13:17:52.314326](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-bagel-7B/blob/main/results_2024-01-24T13-17-52.314326.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6319266433643548, "acc_stderr": 0.032455580300510795, "acc_norm": 0.6389579812660248, "acc_norm_stderr": 0.03312727758951937, "mc1": 0.4589963280293758, "mc1_stderr": 0.017444544447661196, "mc2": 0.6333126520095955, "mc2_stderr": 0.015483397855951944 }, "harness|arc:challenge|25": { "acc": 0.6322525597269625, "acc_stderr": 0.01409099561816848, "acc_norm": 0.6689419795221843, "acc_norm_stderr": 0.01375206241981783 }, "harness|hellaswag|10": { "acc": 0.6609241187014538, "acc_stderr": 0.0047242814878193755, "acc_norm": 0.8481378211511651, "acc_norm_stderr": 0.003581537847581781 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.042763494943765995, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.042763494943765995 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.038234289699266046, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.038234289699266046 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.028901593612411784, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.028901593612411784 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, 
"acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.036928207672648664, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5276595744680851, "acc_stderr": 0.03263597118409769, "acc_norm": 0.5276595744680851, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404907, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404907 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945637, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945637 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015184, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015184 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6487179487179487, "acc_stderr": 0.024203665177902803, "acc_norm": 0.6487179487179487, "acc_norm_stderr": 0.024203665177902803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652458, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652458 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7352941176470589, "acc_stderr": 0.028657491285071977, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.028657491285071977 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 
0.03372343271653062, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.03372343271653062 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676177, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676177 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728745, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728745 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.04026187527591207, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.04026187527591207 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 
0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973133, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973133 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7109826589595376, "acc_stderr": 0.02440517393578323, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.02440517393578323 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.40670391061452515, "acc_stderr": 0.01642881191589886, "acc_norm": 0.40670391061452515, "acc_norm_stderr": 0.01642881191589886 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7450980392156863, "acc_stderr": 0.024954184324879912, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.024954184324879912 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 0.025483115601195455, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.025483115601195455 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47522816166883963, "acc_stderr": 0.012754553719781753, "acc_norm": 0.47522816166883963, "acc_norm_stderr": 0.012754553719781753 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6601307189542484, "acc_stderr": 0.01916241858862355, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.01916241858862355 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 
0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128438, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128438 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.02917088550072767, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.02917088550072767 }, "harness|truthfulqa:mc|0": { "mc1": 0.4589963280293758, "mc1_stderr": 0.017444544447661196, "mc2": 0.6333126520095955, "mc2_stderr": 0.015483397855951944 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987726 }, "harness|gsm8k|5": { "acc": 0.2812736921910538, "acc_stderr": 0.01238478931094024 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
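As noted at the top of this card, each run's split is named after the run's timestamp. Comparing the run timestamp with the split names in the metadata suggests a simple character substitution; the sketch below illustrates that observed mapping (it is inferred from this card, not a documented API):

```python
def run_timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name, as observed in this card.

    Inferred from the metadata above: the run "2024-01-24T13:17:52.314326"
    corresponds to the split "2024_01_24T13_17_52.314326", i.e. "-" and ":"
    are replaced with "_". This is an observation, not an official convention.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split_name("2024-01-24T13:17:52.314326"))
# → 2024_01_24T13_17_52.314326
```

A split name built this way can be passed as `split=` when loading any of the configurations above, alongside the `latest` alias.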
mHossain/indic_model_indic_test_data_paraphrase_detection
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: 'Unnamed: 0' dtype: int64 - name: text dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 3849846.3 num_examples: 36000 - name: test num_bytes: 427760.7 num_examples: 4000 download_size: 1899118 dataset_size: 4277607.0 --- # Dataset Card for "indic_model_indic_test_data_paraphrase_detection" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
RikoteMaster/Emotion_Recognition_4_llama2
--- dataset_info: features: - name: Text_processed dtype: string - name: Emotion dtype: string - name: Augmented dtype: bool - name: text dtype: string splits: - name: train num_bytes: 23956262 num_examples: 61463 download_size: 8510226 dataset_size: 23956262 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Emotion_Recognition_4_llama2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/amiya_arknights
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of amiya/γ‚’γƒΌγƒŸγƒ€/ι˜Ώη±³ε¨… (Arknights) This is the dataset of amiya/γ‚’γƒΌγƒŸγƒ€/ι˜Ώη±³ε¨… (Arknights), containing 500 images and their tags. The core tags of this character are `animal_ears, brown_hair, rabbit_ears, long_hair, blue_eyes, hair_between_eyes, sidelocks, ponytail, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amiya_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 492.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amiya_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1291 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amiya_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 879.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amiya_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1291 | 1.68 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amiya_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/amiya_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 40 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, ascot, long_sleeves, open_jacket, solo, white_shirt, black_jacket, blue_skirt, looking_at_viewer, pleated_skirt, black_pantyhose, plaid_skirt, miniskirt, cowboy_shot, hooded_jacket, closed_mouth, multiple_rings, thumb_ring, brown_pantyhose | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_jacket, black_pantyhose, blue_skirt, full_body, long_sleeves, multiple_rings, open_jacket, pleated_skirt, shoes, solo, white_shirt, looking_at_viewer, plaid_skirt, black_footwear, rabbit_girl, thumb_ring, anklet, miniskirt, hood, parted_lips, blue_ascot, brown_pantyhose, coat, thighlet | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_jacket, bow_(music), holding_instrument, holding_violin, long_sleeves, open_jacket, playing_instrument, solo, white_shirt, black_pantyhose, blue_skirt, pleated_skirt, 
closed_mouth, miniskirt, multiple_rings, plaid_skirt, standing, blue_ascot, closed_eyes, cowboy_shot, hood, rabbit_girl, brown_pantyhose, grey_background, outdoors, simple_background | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, ascot, black_jacket, long_sleeves, looking_at_viewer, open_jacket, solo, upper_body, white_shirt, multiple_rings, hood, simple_background, closed_mouth, white_background, hand_up, parted_lips | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_jacket, open_jacket, solo, upper_body, white_shirt, blue_ascot, closed_mouth, simple_background, white_background, looking_at_viewer | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, headphones, long_sleeves, official_alternate_costume, solo, black_coat, holding_newspaper, looking_at_viewer, multiple_rings, open_clothes, white_dress, black_headwear, black_jacket, closed_mouth, baseball_cap, black_footwear, black_socks, ears_through_headwear, feet_out_of_frame, implied_extra_ears, outdoors, rabbit_girl, shoes, shoulder_bag, simple_background, white_background | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_jacket, brown_pantyhose, cleavage, long_sleeves, looking_at_viewer, medium_breasts, open_jacket, playboy_bunny, solo, strapless_leotard, bare_shoulders, black_leotard, black_pantyhose, blush, detached_collar, highleg_leotard, covered_navel, off_shoulder, open_coat, standing, anklet, ascot, black_coat, cowboy_shot, full_body, 
multiple_rings, open_mouth, parted_lips, simple_background, thighlet, thumb_ring, white_background | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, backpack, baseball_cap, ears_through_headwear, long_sleeves, official_alternate_costume, solo, white_jacket, closed_mouth, looking_at_viewer, open_jacket, white_bag, white_headwear, cowboy_shot, smile, standing, white_coat, holding_bouquet, neckerchief, plant, shirt, thigh_strap, white_background, white_flower | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blue_skirt, long_sleeves, looking_at_viewer, open_jacket, pleated_skirt, red_scarf, solo, white_thighhighs, official_alternate_costume, white_shirt, animal_ear_legwear, blue_jacket, day, outdoors, short_hair, bird, blush, off_shoulder, puffy_sleeves, scooter, smile, standing, unworn_headwear, unworn_helmet | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, hood_up, official_alternate_costume, rabbit_hood, solo, white_jacket, white_pants, hooded_jacket, long_sleeves, midriff, planet, crop_top, looking_at_viewer, navel, open_jacket, space, white_gloves, white_shirt, closed_mouth, cowboy_shot, medium_breasts | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, bare_shoulders, plaid, solo, straw_hat, blue_sky, blush, cloud, day, ears_through_headwear, looking_at_viewer, multiple_rings, outdoors, pink_shirt, red_flower, blue_choker, collarbone, hat_flower, ocean, off-shoulder_shirt, official_alternate_costume, smile, 
beach, brown_headwear, closed_mouth, feet_out_of_frame, hands_up, hibiscus, holding, open_mouth, puffy_short_sleeves, short_shorts, thighlet, upper_body | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ascot | long_sleeves | open_jacket | solo | white_shirt | black_jacket | blue_skirt | looking_at_viewer | pleated_skirt | black_pantyhose | plaid_skirt | miniskirt | cowboy_shot | hooded_jacket | closed_mouth | multiple_rings | thumb_ring | brown_pantyhose | full_body | shoes | black_footwear | rabbit_girl | anklet | hood | parted_lips | blue_ascot | coat | thighlet | bow_(music) | holding_instrument | holding_violin | playing_instrument | standing | closed_eyes | grey_background | outdoors | simple_background | upper_body | white_background | hand_up | headphones | official_alternate_costume | black_coat | holding_newspaper | open_clothes | white_dress | black_headwear | baseball_cap | black_socks | ears_through_headwear | feet_out_of_frame | implied_extra_ears | shoulder_bag | cleavage | medium_breasts | playboy_bunny | strapless_leotard | bare_shoulders | black_leotard | blush | detached_collar | highleg_leotard | covered_navel | off_shoulder | open_coat | open_mouth | backpack | white_jacket | white_bag | white_headwear | smile | white_coat | holding_bouquet | neckerchief | plant | shirt | thigh_strap | white_flower | red_scarf | white_thighhighs | animal_ear_legwear | blue_jacket | day | short_hair | bird | puffy_sleeves | scooter | unworn_headwear | unworn_helmet | hood_up | rabbit_hood | white_pants | midriff | planet | crop_top | navel | space | white_gloves | plaid | straw_hat | blue_sky | cloud | pink_shirt | red_flower | blue_choker | collarbone | hat_flower | ocean | off-shoulder_shirt | beach | brown_headwear | hands_up | hibiscus | holding | puffy_short_sleeves | short_shorts | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:---------------|:--------------|:-------|:--------------|:---------------|:-------------|:--------------------|:----------------|:------------------|:--------------|:------------|:--------------|:----------------|:---------------|:-----------------|:-------------|:------------------|:------------|:--------|:-----------------|:--------------|:---------|:-------|:--------------|:-------------|:-------|:-----------|:--------------|:---------------------|:-----------------|:---------------------|:-----------|:--------------|:------------------|:-----------|:--------------------|:-------------|:-------------------|:----------|:-------------|:-----------------------------|:-------------|:--------------------|:---------------|:--------------|:-----------------|:---------------|:--------------|:------------------------|:--------------------|:---------------------|:---------------|:-----------|:-----------------|:----------------|:--------------------|:-----------------|:----------------|:--------|:------------------|:------------------|:----------------|:---------------|:------------|:-------------|:-----------|:---------------|:------------|:-----------------|:--------|:-------------|:------------------|:--------------|:--------|:--------|:--------------|:---------------|:------------|:-------------------|:---------------------|:--------------|:------|:-------------|:-------|:----------------|:----------|:------------------|:----------------|:----------|:--------------|:--------------|:----------|:---------|:-----------|:--------|:--------|:---------------|:--------|:------------|:-----------|:--------|:-------------|:-------------|:--------------|:-------------|:-------------|:--------|:---------------------|:--------|:-----------------|:-----------|:-----
------|:----------|:----------------------|:---------------| | 0 | 40 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | X | X | X | X | | X | X | X | X | X | | X | X | | X | | | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | | X | | | | | | | X | X | | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | 
![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | X | X | | X | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | X | | X | | X | | | | | | | X | X | | | | X | X | X | | | | | | | | | | | | | | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | X | | X | | X | | | X | | | X | X | X | X | | | | X | | X | | | X | | | | | X | | | | X | | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | X | X | | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | X | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | X | X | X | | X | X | X | | | | | | | | | | 
| | | | | | | | | | | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | X | X | X | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | | X | | | | X | | | | | | | X | X | | | | | | | | | | | | X | | | | | | | | X | | X | | | | X | | | | | | | | X | X | | | | | | | X | | X | | | | | | X | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-44000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 1079762 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
AdapterOcean/chemistry_dataset_standardized_embedded
--- dataset_info: features: - name: text dtype: string - name: conversation_id dtype: int64 - name: embedding sequence: float32 splits: - name: train num_bytes: 126941686 num_examples: 19999 download_size: 60774351 dataset_size: 126941686 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "chemistry_dataset_standardized_embedded" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-moral_disputes-original-neg
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D splits: - name: test num_bytes: 16513.63583815029 num_examples: 53 download_size: 14796 dataset_size: 16513.63583815029 --- # Dataset Card for "mmlu-moral_disputes-original-neg" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AlexFromSynlabs/sl1
--- license: other license_name: sl1 license_link: LICENSE ---
autoevaluate/autoeval-eval-inverse-scaling__41-inverse-scaling__41-aa9680-1691959549
--- type: predictions tags: - autotrain - evaluation datasets: - inverse-scaling/41 eval_info: task: text_zero_shot_classification model: inverse-scaling/opt-6.7b_eval metrics: [] dataset_name: inverse-scaling/41 dataset_config: inverse-scaling--41 dataset_split: train col_mapping: text: prompt classes: classes target: answer_index --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Zero-Shot Text Classification * Model: inverse-scaling/opt-6.7b_eval * Dataset: inverse-scaling/41 * Config: inverse-scaling--41 * Split: train To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model.
yzhuang/autotree_pmlb_10000_letter_sgosdt_l256_dim10_d3_sd0
--- dataset_info: features: - name: id dtype: int64 - name: input_x sequence: sequence: float32 - name: input_y sequence: sequence: float32 - name: input_y_clean sequence: sequence: float32 - name: rtg sequence: float64 - name: status sequence: sequence: float32 - name: split_threshold sequence: sequence: float32 - name: split_dimension sequence: int64 splits: - name: train num_bytes: 706482624 num_examples: 10000 - name: validation num_bytes: 708636096 num_examples: 10000 download_size: 54846762 dataset_size: 1415118720 --- # Dataset Card for "autotree_pmlb_10000_letter_sgosdt_l256_dim10_d3_sd0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
multi-train/emb-reddit-title-body
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: query dtype: string - name: pos dtype: string - name: idx dtype: int64 - name: task_name dtype: string splits: - name: train num_bytes: 95637449375 num_examples: 127445911 download_size: 3302152777 dataset_size: 95637449375 --- # Dataset Card for "emb-reddit-title-body" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Skepsun/huozi_rlhf_data_json
--- license: apache-2.0 language: - zh pretty_name: hz_rlhf --- Converted from: https://github.com/HIT-SCIR/huozi
launch/open_question_type
--- annotations_creators: - expert-generated language: - en license: - cc-by-4.0 multilinguality: - monolingual task_categories: - text-classification task_ids: [] pretty_name: OpenQuestionType --- # Dataset Card for OpenQuestionType ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [https://shuyangcao.github.io/projects/ontology_open_ended_question/](https://shuyangcao.github.io/projects/ontology_open_ended_question/) - **Repository:** [https://github.com/ShuyangCao/open-ended_question_ontology](https://github.com/ShuyangCao/open-ended_question_ontology) - **Paper:** [https://aclanthology.org/2021.acl-long.502/](https://aclanthology.org/2021.acl-long.502/) - **Leaderboard:** [Needs More Information] - **Point of Contact:** [Needs More Information] ### Dataset Summary Question types annotated on open-ended questions. 
### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

English

## Dataset Structure

### Data Instances

An example looks as follows.

```
{
    "id": "123",
    "question": "A test question?",
    "annotator1": ["verification", None],
    "annotator2": ["concept", None],
    "resolve_type": "verification"
}
```

### Data Fields

- `id`: a `string` feature.
- `question`: a `string` feature.
- `annotator1`: a sequence feature containing two elements. The first one is the most confident label by the first annotator and the second one is the second-most confident label by the first annotator.
- `annotator2`: a sequence feature containing two elements. The first one is the most confident label by the second annotator and the second one is the second-most confident label by the second annotator.
- `resolve_type`: a `string` feature which is the final label after resolving disagreement.

### Data Splits

- train: 3716
- valid: 580
- test: 660

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

Yahoo Answers and Reddit users.

### Personal and Sensitive Information

None.
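The paired `annotator1`/`annotator2` fields invite a simple agreement heuristic when working with the raw annotations. The sketch below is purely illustrative — the dataset already ships the authors' own resolved label in `resolve_type`, so this is not their adjudication procedure:

```python
def resolve_question_type(annotator1, annotator2):
    """Illustrative resolution of two ranked label pairs into one type.

    Each annotator gives [most_confident, second_most_confident], where the
    second element may be None. NOT the authors' procedure; the dataset's
    `resolve_type` field stores their resolved label.
    """
    a1_top, a1_second = annotator1
    a2_top, a2_second = annotator2
    if a1_top == a2_top:            # direct agreement on top choice
        return a1_top
    if a1_top == a2_second:         # one annotator's top choice matches
        return a1_top               # the other's backup choice
    if a2_top == a1_second:
        return a2_top
    return None                     # would need manual adjudication


record = {
    "id": "123",
    "question": "A test question?",
    "annotator1": ["verification", None],
    "annotator2": ["concept", "verification"],
}
print(resolve_question_type(record["annotator1"], record["annotator2"]))  # verification
```

Records where the heuristic returns `None` are exactly the cases where the shipped `resolve_type` carries information beyond the two annotator fields.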
## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information CC BY 4.0 ### Citation Information ``` @inproceedings{cao-wang-2021-controllable, title = "Controllable Open-ended Question Generation with A New Question Type Ontology", author = "Cao, Shuyang and Wang, Lu", booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)", month = aug, year = "2021", address = "Online", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2021.acl-long.502", doi = "10.18653/v1/2021.acl-long.502", pages = "6424--6439", abstract = "We investigate the less-explored task of generating open-ended questions that are typically answered by multiple sentences. We first define a new question type ontology which differentiates the nuanced nature of questions better than widely used question words. A new dataset with 4,959 questions is labeled based on the new ontology. We then propose a novel question type-aware question generation framework, augmented by a semantic graph representation, to jointly predict question focuses and produce the question. Based on this framework, we further use both exemplars and automatically generated templates to improve controllability and diversity. Experiments on two newly collected large-scale datasets show that our model improves question quality over competitive comparisons based on automatic metrics. Human judges also rate our model outputs highly in answerability, coverage of scope, and overall quality. Finally, our model variants with templates can produce questions with enhanced controllability and diversity.", } ```
ozayezerceli/text-to-cypher-engQuestions
--- dataset_info: features: - name: input_text dtype: string - name: output_text dtype: string splits: - name: train num_bytes: 91708 num_examples: 120 download_size: 16915 dataset_size: 91708 configs: - config_name: default data_files: - split: train path: data/train-* ---
cun-bjy/mpi3d_real
---
license: bsd
task_categories:
- feature-extraction
language:
- ar
tags:
- code
pretty_name: mpi3d_real
size_categories:
- 100M<n<1B
---

# mpi3d_real

This repository is an _unofficial_ backup service for the `MPI3D-Real` dataset, introduced in the paper [**On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset**](https://proceedings.neurips.cc/paper/2019/hash/d97d404b6119214e4a7018391195240a-Abstract.html). The dataset contains images of real objects with varying factors of variation, such as shape, color, texture and pose, and is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable.

For more detailed information on the dataset, please check the [original repository](https://github.com/rr-learning/disentanglement_dataset).

## Reference

[1] Gondal, Muhammad Waleed, et al. "On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset." Advances in Neural Information Processing Systems 32 (2019).
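MPI3D stores one image per combination of its seven factors of variation, in a fixed row-major order, so the flat image index can be computed from factor values. The factor sizes below are taken from the MPI3D paper and should be double-checked against the files you actually download:

```python
# Factor sizes as reported for MPI3D (verify against your download):
# object_color, object_shape, object_size, camera_height,
# background_color, horizontal_axis, vertical_axis
FACTOR_SIZES = (6, 6, 2, 3, 3, 40, 40)

def flat_index(factors):
    """Row-major index of an image given its 7 factor values."""
    assert len(factors) == len(FACTOR_SIZES)
    idx = 0
    for value, size in zip(factors, FACTOR_SIZES):
        assert 0 <= value < size
        idx = idx * size + value
    return idx

# One image per factor combination:
total = 1
for size in FACTOR_SIZES:
    total *= size
print(total)                              # 1036800
print(flat_index((0, 0, 0, 0, 0, 0, 1)))  # 1: the last factor varies fastest
```

Indexing this way lets you slice the image array along a single factor while holding the others fixed, which is the usual setup for disentanglement evaluations.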
irds/mmarco_zh_dev
--- pretty_name: '`mmarco/zh/dev`' viewer: false source_datasets: ['irds/mmarco_zh'] task_categories: - text-retrieval --- # Dataset Card for `mmarco/zh/dev` The `mmarco/zh/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/zh/dev). # Data This dataset provides: - `queries` (i.e., topics); count=101,093 - `qrels`: (relevance assessments); count=59,273 - For `docs`, use [`irds/mmarco_zh`](https://huggingface.co/datasets/irds/mmarco_zh) This dataset is used by: [`mmarco_zh_dev_v1.1`](https://huggingface.co/datasets/irds/mmarco_zh_dev_v1.1) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/mmarco_zh_dev', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/mmarco_zh_dev', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in πŸ€— Dataset format. ## Citation Information ``` @article{Bonifacio2021MMarco, title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset}, author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira}, year={2021}, journal={arXiv:2108.13897} } ```
GATE-engine/aircraft
--- dataset_info: features: - name: image dtype: image - name: label dtype: int64 splits: - name: train num_bytes: 448764494.0 num_examples: 7000 - name: validation num_bytes: 96057035.5 num_examples: 1500 - name: test num_bytes: 96864989.5 num_examples: 1500 download_size: 641660472 dataset_size: 641686519.0 --- # Dataset Card for "aircraft" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Nexdata/Korean_Speaking_English_Speech_Data_by_Mobile_Phone
--- YAML tags: - copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging --- # Dataset Card for Nexdata/Korean_Speaking_English_Speech_Data_by_Mobile_Phone ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://www.nexdata.ai/datasets/1041?source=Huggingface - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary The product contains the speech data recorded by 400 native Korean speakers, with roughly equal gender distribution. The corpus covers a wide domain with rich content of generic category, human-machine interaction category, in-car category, smart home category, etc. The corpus text was manually checked to ensure the high accuracy. 
For more details, please refer to the link: https://www.nexdata.ai/datasets/1041?source=Huggingface ### Supported Tasks and Leaderboards automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR). ### Languages Korean English ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing ### Citation Information [More Information Needed] ### Contributions
EleutherAI/quirky_population_alice_easy
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: alice_label dtype: bool - name: bob_label dtype: bool - name: difficulty dtype: float64 - name: statement dtype: string - name: choices sequence: string - name: character dtype: string - name: label dtype: bool splits: - name: train num_bytes: 100481.52996129722 num_examples: 936 - name: validation num_bytes: 52277.989 num_examples: 487 - name: test num_bytes: 62218.195 num_examples: 580 download_size: 59136 dataset_size: 214977.71396129724 --- # Dataset Card for "quirky_population_alice_easy" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Aditi-Gp/flan_v2_processed_dataset
--- dataset_info: features: - name: 'Claim: "Only people named Floyd wearing pink are allowed to attend Pink Floyd concerts."\nIs the claim above correct, and can it be verified by human common sense and without a web search?\nOptions:\n- yes\n- no' dtype: string - name: 'no' dtype: string - name: The rock group would not be as popular is they had such requirements for their concerts. dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 1621790 num_examples: 6912 download_size: 868274 dataset_size: 1621790 configs: - config_name: default data_files: - split: train path: data/train-* --- # Arakoo Internship Assignment ## By Aditi Gupta ## Finding the Training Set I extracted the training set from this GitHub repository: https://github.com/google-research/FLAN/tree/main/flan/v2/cot_data. Multiple .tsv files were available there, but appending them all into a single dataset introduced too many NULL values, so only one training file is used. ## Removing instructions with fewer than 100 tokens in the response Using TfidfVectorizer from the open-source Python library scikit-learn, I removed instructions whose responses contained fewer than 100 tokens. ## Data deduplication by grouping with cosine similarity (threshold > 0.95) Using cosine_similarity from sklearn.metrics.pairwise, I grouped examples whose cosine similarity exceeded 0.95 and kept one representative per group.
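A minimal sketch of that grouping step (assuming scikit-learn is installed; the sentences below are toy stand-ins, not actual rows from this dataset):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

responses = [
    "The rock group would not be as popular if they had such requirements.",
    "The rock group would not be as popular if they had such requirements!",  # near-duplicate
    "Only people named Floyd wearing pink may attend, which seems absurd.",
]

# TF-IDF vectors for every response, then pairwise cosine similarities.
tfidf = TfidfVectorizer().fit_transform(responses)
sim = cosine_similarity(tfidf)

# Greedy grouping: keep a response only if it is not >0.95 similar
# to any response already kept.
kept = []
for i in range(len(responses)):
    if all(sim[i, j] <= 0.95 for j in kept):
        kept.append(i)

deduped = [responses[i] for i in kept]
print(len(deduped))  # the two near-duplicates collapse into one entry
```

In the actual pipeline the inputs would be the dataset's response column rather than these toy strings.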
HamdanXI/arb-eng-parallel-10k-splitted-euclidean-100
--- dataset_info: features: - name: arabic dtype: string - name: english dtype: string splits: - name: train num_bytes: 3054333 num_examples: 6636 - name: validation num_bytes: 407437 num_examples: 1000 - name: test num_bytes: 419389 num_examples: 1000 download_size: 2193837 dataset_size: 3881159 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
CyberHarem/tiamo_fireemblem
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of tiamo (Fire Emblem) This is the dataset of tiamo (Fire Emblem), containing 449 images and their tags. The core tags of this character are `long_hair, red_hair, red_eyes, breasts, hair_ornament, hair_between_eyes, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 449 | 532.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 449 | 324.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1004 | 622.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 449 | 482.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1004 | 835.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/tiamo_fireemblem', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, looking_at_viewer, navel, nipples, smile, solo, large_breasts, completely_nude, female_pubic_hair, pussy | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, navel, nipples, solo, thighhighs, looking_at_viewer, pussy, elbow_gloves, medium_breasts, nude, small_breasts, smile | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1boy, 1girl, blush, completely_nude, hetero, nipples, sex, solo_focus, vaginal, mosaic_censoring, navel, open_mouth, penis, pussy, spread_legs, small_breasts, medium_breasts, sweat, missionary, on_back, pov, looking_at_viewer, pillow | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, anus, blush, completely_nude, hetero, looking_at_viewer, looking_back, mosaic_censoring, open_mouth, penis, pussy, solo_focus, vaginal, medium_breasts, nipples, sex_from_behind, ass_grab, girl_on_top, 
reverse_cowgirl_position, indoors, spread_legs, sweat, wing_hair_ornament | | 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, garter_straps, gauntlets, solo, thighhighs, thigh_boots, spear, breastplate, looking_at_viewer, belt | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, garter_straps, holding_weapon, solo, thigh_boots, thighhighs, breastplate, feathers, gloves, looking_at_viewer, red_dress, short_dress, smile, spear, wing_hair_ornament, gauntlets, shoulder_armor, zettai_ryouiki | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, gauntlets, smile, solo, looking_at_viewer, polearm, breastplate, holding_weapon, simple_background, white_background | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, full_body, holding_bow_(weapon), solo, wedding_dress, white_dress, high_heels, looking_at_viewer, bride, gloves, simple_background, smile, bare_shoulders, bridal_gauntlets, grey_background, holding_arrow, one_eye_closed, open_mouth, white_background | | 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | wedding_dress, 1girl, bare_shoulders, blush, looking_at_viewer, smile, solo, white_dress, bride, pearl_necklace | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, solo, 
bare_shoulders, holding_weapon, looking_at_viewer, navel, red_bikini, fingerless_gloves, collarbone, fish, smile, bangs, bikini_skirt, cleavage, full_body, simple_background, spear, blush, sandals, small_breasts, toeless_footwear | | 10 | 14 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, looking_at_viewer, solo, red_bikini, navel, smile, blush, sky, small_breasts, upper_body, collarbone, bare_shoulders, cloud, day, bangs, open_mouth, outdoors, wing_hair_ornament | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | navel | nipples | smile | solo | large_breasts | completely_nude | female_pubic_hair | pussy | thighhighs | elbow_gloves | medium_breasts | nude | small_breasts | 1boy | hetero | sex | solo_focus | vaginal | mosaic_censoring | open_mouth | penis | spread_legs | sweat | missionary | on_back | pov | pillow | anus | looking_back | sex_from_behind | ass_grab | girl_on_top | reverse_cowgirl_position | indoors | wing_hair_ornament | garter_straps | gauntlets | thigh_boots | spear | breastplate | belt | holding_weapon | feathers | gloves | red_dress | short_dress | shoulder_armor | zettai_ryouiki | polearm | simple_background | white_background | full_body | holding_bow_(weapon) | wedding_dress | white_dress | high_heels | bride | bare_shoulders | bridal_gauntlets | grey_background | holding_arrow | one_eye_closed | pearl_necklace | red_bikini | fingerless_gloves | collarbone | fish | bangs | bikini_skirt | cleavage | sandals | toeless_footwear | sky | upper_body | cloud | day | outdoors | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------------|:--------|:----------|:--------|:-------|:----------------|:------------------|:--------------------|:--------|:-------------|:---------------|:-----------------|:-------|:----------------|:-------|:---------|:------|:-------------|:----------|:-------------------|:-------------|:--------|:--------------|:--------|:-------------|:----------|:------|:---------|:-------|:---------------|:------------------|:-----------|:--------------|:---------------------------|:----------|:---------------------|:----------------|:------------|:--------------|:--------|:--------------|:-------|:-----------------|:-----------|:---------|:------------|:--------------|:-----------------|:-----------------|:----------|:--------------------|:-------------------|:------------|:-----------------------|:----------------|:--------------|:-------------|:--------|:-----------------|:-------------------|:------------------|:----------------|:-----------------|:-----------------|:-------------|:--------------------|:-------------|:-------|:--------|:---------------|:-----------|:----------|:-------------------|:------|:-------------|:--------|:------|:-----------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | X | | | | X | | X | | | X | | | X | X | | X | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 
5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | X | | | | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | X | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | X | | X | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | 10 | 14 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | X | X | | X | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | X | | X | | | | | X | X | X | X | X |
csujeong/kullm-v2.1
--- license: apache-2.0 task_categories: - text-generation language: - ko pretty_name: kullm size_categories: - 10K<n<100K --- # Dataset Card for "KULLM-v2" ## Dataset Summary Korean translation of GPT4ALL, Dolly, and Vicuna data. repository: [nlpai-lab/KULLM](https://github.com/nlpai-lab/KULLM) huggingface: [nlpai-lab/kullm-v2](https://huggingface.co/nlpai-lab/kullm-polyglot-12.8b-v2) #### Translated dataset The 'instruction', 'input', and 'output' fields in the dataset were translated via the DeepL API. ## License Apache-2.0 ```python >>> from datasets import load_dataset >>> ds = load_dataset("nlpai-lab/kullm-v2", split="train") >>> ds Dataset({ features: ['id', 'instruction', 'input', 'output'], num_rows: 152630 }) ``` ```python >>> ds[0] {'id': 'alpaca_{idx}', 'instruction': '3μ›μƒ‰μ΄λž€ λ¬΄μ—‡μΈκ°€μš”?', 'input': '', 'output': 'μ„Έ κ°€μ§€ κΈ°λ³Έ 색은 λΉ¨κ°•, νŒŒλž‘, λ…Έλž‘μž…λ‹ˆλ‹€. 이 색은 λ‹€λ₯Έ 색을 ν˜Όν•©ν•˜μ—¬ λ§Œλ“€ 수 μ—†κ³  λ‹€λ₯Έ λͺ¨λ“  색은 λ‹€μ–‘ν•œ λΉ„μœ¨λ‘œ μ‘°ν•©ν•˜μ—¬ λ§Œλ“€ 수 있기 λ•Œλ¬Έμ— 원색이라고 λΆ€λ¦…λ‹ˆλ‹€. 빛에 μ‚¬μš©λ˜λŠ” μ²¨κ°€μ œ 색상 μ‹œμŠ€ν…œμ—μ„œ 원색은 λΉ¨κ°•, 녹색, νŒŒλž‘(RGB)μž…λ‹ˆλ‹€.'} ```
mathews5546/mathewzvk-starter-llama2-1k
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1654448 num_examples: 1000 download_size: 966692 dataset_size: 1654448 configs: - config_name: default data_files: - split: train path: data/train-* ---
tab_fact
--- annotations_creators: - crowdsourced language_creators: - crowdsourced language: - en license: - cc-by-4.0 multilinguality: - monolingual size_categories: - 100K<n<1M source_datasets: - original task_categories: - text-classification task_ids: - fact-checking paperswithcode_id: tabfact pretty_name: TabFact dataset_info: - config_name: tab_fact features: - name: id dtype: int32 - name: table_id dtype: string - name: table_text dtype: string - name: table_caption dtype: string - name: statement dtype: string - name: label dtype: class_label: names: '0': refuted '1': entailed splits: - name: train num_bytes: 99852664 num_examples: 92283 - name: validation num_bytes: 13846872 num_examples: 12792 - name: test num_bytes: 13493391 num_examples: 12779 download_size: 196508436 dataset_size: 127192927 - config_name: blind_test features: - name: id dtype: int32 - name: table_id dtype: string - name: table_text dtype: string - name: table_caption dtype: string - name: statement dtype: string - name: test_id dtype: string splits: - name: test num_bytes: 10954442 num_examples: 9750 download_size: 196508436 dataset_size: 10954442 --- # Dataset Card for TabFact ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional 
Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [TabFact](https://tabfact.github.io/index.html) - **Repository:** [GitHub](https://github.com/wenhuchen/Table-Fact-Checking) - **Paper:** [TabFact: A Large-scale Dataset for Table-based Fact Verification](https://arxiv.org/abs/1909.02164) - **Leaderboard:** [Leaderboard](https://competitions.codalab.org/competitions/21611) - **Point of Contact:** [Wenhu Chen](wenhuchen@cs.ucsb.edu) ### Dataset Summary The problem of verifying whether a textual hypothesis holds the truth based on the given evidence, also known as fact verification, plays an important role in the study of natural language understanding and semantic representation. However, existing studies are restricted to dealing with unstructured textual evidence (e.g., sentences and passages, a pool of passages), while verification using structured forms of evidence, such as tables, graphs, and databases, remains unexplored. TABFACT is a large-scale dataset with 16k Wikipedia tables as evidence for 118k human-annotated statements, designed for fact verification with semi-structured evidence. The statements are labeled as either ENTAILED or REFUTED. TABFACT is challenging since it involves both soft linguistic reasoning and hard symbolic reasoning. ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data [More Information Needed] #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers?
[More Information Needed] ### Annotations [More Information Needed] #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information ``` @inproceedings{2019TabFactA, title={TabFact : A Large-scale Dataset for Table-based Fact Verification}, author={Wenhu Chen, Hongmin Wang, Jianshu Chen, Yunkai Zhang, Hong Wang, Shiyang Li, Xiyou Zhou and William Yang Wang}, booktitle = {International Conference on Learning Representations (ICLR)}, address = {Addis Ababa, Ethiopia}, month = {April}, year = {2020} } ``` ### Contributions Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset.
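As a toy illustration of the label scheme described in the summary (label indices taken from this card's YAML, where 0 = refuted and 1 = entailed; the table and claims below are invented, not drawn from TabFact):

```python
# The dataset's class_label mapping: 0 -> refuted, 1 -> entailed.
LABELS = {0: "refuted", 1: "entailed"}

# A tiny stand-in for a Wikipedia evidence table.
table = {
    "caption": "winter games medal tally",
    "rows": [("norway", 14), ("germany", 14), ("canada", 11)],
}

def verify(country: str, claimed_medals: int) -> int:
    """Hard symbolic check: does the table support the claimed count?"""
    actual = dict(table["rows"]).get(country)
    return 1 if actual == claimed_medals else 0

print(LABELS[verify("norway", 14)])   # entailed
print(LABELS[verify("canada", 12)])   # refuted
```

Real TabFact statements also require the soft linguistic reasoning mentioned above; this sketch only shows the entailed/refuted labeling convention.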
growth-cadet/jobpost_signals-to-json_test_mistral02gen
--- dataset_info: features: - name: id dtype: string - name: ats dtype: string - name: context dtype: string - name: context_token_count dtype: int64 - name: gpt-4_response dtype: string - name: gpt-4_cost dtype: float64 - name: gpt-4_sys5_response dtype: string - name: gpt-4_sys5_cost dtype: float64 - name: sys5_obj struct: - name: focus_areas list: - name: description dtype: string - name: subject dtype: string - name: industries list: - name: description dtype: string - name: subject dtype: string - name: products_and_technologies list: - name: description dtype: string - name: subject dtype: string - name: mistral02_gen dtype: string splits: - name: train num_bytes: 18652315 num_examples: 1806 download_size: 8459373 dataset_size: 18652315 configs: - config_name: default data_files: - split: train path: data/train-* ---