datasetId (string, lengths 2–117)
card (string, lengths 19–1.01M)
giodeleo/test
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1297 num_examples: 5 download_size: 2807 dataset_size: 1297 configs: - config_name: default data_files: - split: train path: data/train-* ---
RPATaskForce/DocMPS
--- license: llama2 language: - en size_categories: - n<1K ---
e-mohammadii/sss
--- license: creativeml-openrail-m task_categories: - token-classification language: - ae tags: - biology pretty_name: some size_categories: - 10K<n<100K ---
Aditya000001/TestDatasetForRA
--- license: wtfpl tags: - transportation - trains - travel data - english --- # Dataset Description ## General Information - **Title**: TrainInfo2023 - **Description**: This dataset contains information about train schedules, routes, and passenger statistics for the year 2023. - **Version**: 1.0 - **Author**: [Your Name or Organization] - **License**: [Appropriate License, e.g., MIT, CC BY 4.0] - **URL**: [Link to where the dataset can be downloaded or accessed] ## Dataset Structure ### Data Instances A sample entry from the dataset:
```json
{
  "train_id": "12345A",
  "route": "North-East",
  "departure_time": "2023-01-01 08:00:00",
  "arrival_time": "2023-01-01 12:00:00",
  "passenger_count": 200,
  "station_details": [
    {"station_name": "Station A", "arrival": "09:00", "departure": "09:10"},
    {"station_name": "Station B", "arrival": "10:00", "departure": "10:15"}
  ]
}
```
astvito/Polachek
--- license: apache-2.0 ---
nlp-brin-id/unsup-title
--- license: mit task_categories: - text-classification language: - id size_categories: - 10K<n<100K --- This dataset is derived from nlp-brin-id/id-hoax-report-merge-v2. The subsets can serve as samples for unsupervised contrastive learning, using short claims as input for fake news detection. The attribute used is 'Title'.
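As an illustration of the intended use, here is a minimal, hypothetical sketch of building (anchor, positive) pairs from the 'Title' attribute for contrastive learning. The token-dropout augmentation and the sample titles are assumptions for illustration only, not part of this dataset; real pipelines often use back-translation or embedding-level dropout instead.

```python
import random

def make_contrastive_pairs(titles, seed=0):
    """Build (anchor, positive) pairs by lightly corrupting each title.

    Dropping one random token is a simple text augmentation that keeps
    the positive semantically close to its anchor.
    """
    rng = random.Random(seed)
    pairs = []
    for title in titles:
        tokens = title.split()
        if len(tokens) > 1:
            drop = rng.randrange(len(tokens))
            positive = " ".join(t for i, t in enumerate(tokens) if i != drop)
        else:
            positive = title
        pairs.append((title, positive))
    return pairs

# Hypothetical example titles, not entries from the dataset.
pairs = make_contrastive_pairs(["Hoax report about flood relief", "Breaking news"])
```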
joshuasundance/govgis_nov2023
--- language: - en tags: - gis - geospatial license: mit size_categories: - 100K<n<1M --- # govgis_nov2023 🤖 This README was written by GPT-4. 🤖 `govgis_nov2023` is an extensive compilation of metadata, documenting geospatial data from known government servers as of November 15 2023. This should provide a rich resource for GIS analysis, research, and application development. These datasets contain data from various Federal, State, County, and City ArcGIS Servers listed by Joseph Elfelt of [Mapping Support](https://mappingsupport.com). It serves as a unique snapshot capturing the state of these servers in November 2023. This repo contains the [very messy] notebooks with the code used to compile the data and save it in parquet format. ## Overview - Content: Includes three primary files: servers.parquet, services.parquet, and layers.parquet, offering detailed insights into numerous GIS servers and layers. - Size and Scope: The dataset covers data from 1684 servers, detailing almost a million individual layers with extensive metadata including field information for feature layers, cell size for raster layers, etc. - Format: Data is stored in Parquet format, facilitating efficient storage and quick access. - Status: This is a static snapshot and not actively maintained like Joseph Elfelt’s ongoing listings. However, this foundation may evolve into a maintained index. ## Data Collection - Tools & Libraries Used: Data was collected using the [`restgdf`](https://github.com/joshuasundance-swca/restgdf) library, designed for efficient and asynchronous interaction with ArcGIS servers. - Process: The dataset was created by scraping information from a wide range of ArcGIS servers, focusing on capturing a comprehensive and detailed snapshot as of November 2023. - Verification: While data integrity was a focus, the dataset was not subjected to extensive cleaning, preserving the raw and detailed nature of the information. 
## Data Processing - Data Cleaning: Minimal cleaning was conducted to maintain the dataset's comprehensive and raw nature, allowing users to filter and process data as needed. - Data Transformation: Collected data was standardized and converted into Parquet format for ease of use and accessibility. ## Use Cases The `govgis_nov2023` dataset can be utilized for: - Educational and Research Purposes: A valuable resource for GIS students, educators, and researchers. - Geospatial Data Analysis: Ideal for analysts and data scientists for conducting extensive geospatial analyses. - GIS Application Development: Useful for developers in building or enhancing GIS-related applications. - Language Model Integration: The dataset can be used to train or evaluate language models for generating descriptions or summaries of GIS data. ## Conclusion - Creation: This dataset was created using the restgdf library, emphasizing the potential of open-source contributions in the GIS field. - Data Source: The dataset comprises data from publicly accessible ArcGIS servers. The dataset creator has no affiliation with Joseph Elfelt, MappingSupport.com, or the servers' respective owners.
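To illustrate how the three Parquet files might be combined in an analysis, here is a hedged sketch joining layers to their parent servers with pandas. The column names (`server_id`, `url`, `layer_id`, `name`) and the miniature in-memory frames are assumptions for illustration; check them against the actual files.

```python
import pandas as pd

# Hypothetical stand-ins for servers.parquet and layers.parquet; in practice
# use pd.read_parquet("servers.parquet") and pd.read_parquet("layers.parquet").
servers = pd.DataFrame({
    "server_id": [1, 2],
    "url": ["https://gis.example.gov/a/arcgis", "https://gis.example.gov/b/arcgis"],
})
layers = pd.DataFrame({
    "layer_id": [10, 11, 12],
    "server_id": [1, 1, 2],
    "name": ["parcels", "roads", "zoning"],
})

# Attach server metadata to every layer for downstream filtering and analysis.
joined = layers.merge(servers, on="server_id", how="left")
```

A left join keeps every layer even if its server record is missing, which suits a raw, minimally cleaned snapshot like this one.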
CyberHarem/rapunzel_nikke
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of rapunzel/ラプンツェル/长发公主/라푼젤 (Nikke: Goddess of Victory) This is the dataset of rapunzel/ラプンツェル/长发公主/라푼젤 (Nikke: Goddess of Victory), containing 72 images and their tags. The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, bangs, large_breasts, very_long_hair, braid`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 72 | 130.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapunzel_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 72 | 63.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapunzel_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 170 | 137.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapunzel_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 72 | 110.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapunzel_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 170 | 209.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rapunzel_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/rapunzel_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

Tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, nun, bodysuit, looking_at_viewer, gloves, habit, smile, open_mouth, blush, braided_ponytail, holding, dress |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | nun | bodysuit | looking_at_viewer | gloves | habit | smile | open_mouth | blush | braided_ponytail | holding | dress |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
AkashMnd/PipelineSelector
--- license: mit ---
maryantocinn/indosum
--- tags: - summarization language: - ind --- # indosum INDOSUM is a new benchmark dataset for Indonesian text summarization. The dataset consists of news articles and manually constructed summaries. ## Dataset Usage Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`. ## Citation ``` @INPROCEEDINGS{8629109, author={Kurniawan, Kemal and Louvan, Samuel}, booktitle={2018 International Conference on Asian Language Processing (IALP)}, title={Indosum: A New Benchmark Dataset for Indonesian Text Summarization}, year={2018}, volume={}, number={}, pages={215-220}, doi={10.1109/IALP.2018.8629109}} ``` ## License Apache License, Version 2.0 ## Homepage [https://github.com/kata-ai/indosum](https://github.com/kata-ai/indosum) ### NusaCatalogue For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue)
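Following the usage note above, a hedged loading sketch (the repo id is taken from this card; `nusacrowd` must be installed first, and the guarded call requires network access to the Hub):

```python
REPO_ID = "maryantocinn/indosum"  # repo id from this card

def load_indosum():
    # Requires: pip install nusacrowd datasets
    from datasets import load_dataset
    return load_dataset(REPO_ID)

if __name__ == "__main__":
    dataset = load_indosum()
    print(dataset)  # news articles paired with manually constructed summaries
```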
zolak/twitter_dataset_1713024499
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: float64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 66257828 num_examples: 169986 download_size: 33787909 dataset_size: 66257828 configs: - config_name: default data_files: - split: train path: data/train-* ---
khoomeik/gzipscale-0.32-10_500_5_10-100M
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 355211686 num_examples: 390625 download_size: 49419273 dataset_size: 355211686 configs: - config_name: default data_files: - split: train path: data/train-* ---
arthurmluz/temario_data-wiki_gptextsum_results
--- dataset_info: features: - name: id dtype: string - name: text dtype: string - name: summary dtype: string - name: gen_summary dtype: string - name: rouge struct: - name: rouge1 dtype: float64 - name: rouge2 dtype: float64 - name: rougeL dtype: float64 - name: rougeLsum dtype: float64 - name: bert struct: - name: f1 sequence: float64 - name: hashcode dtype: string - name: precision sequence: float64 - name: recall sequence: float64 splits: - name: validation num_bytes: 208005 num_examples: 25 download_size: 164069 dataset_size: 208005 configs: - config_name: default data_files: - split: validation path: data/validation-* --- # Dataset Card for "temario_data-wiki_gptextsum_results" rouge= {'rouge1': 0.21036975294101332, 'rouge2': 0.07970392536191843, 'rougeL': 0.1477604081207584, 'rougeLsum': 0.1477604081207584} bert= {'precision': 0.7488837575912476, 'recall': 0.6433243179321289, 'f1': 0.6917135095596314}
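A quick consistency check on the BERTScore numbers above: an F1 pooled from the reported mean precision and recall lands close to, but not exactly at, the reported mean per-example F1, because the harmonic mean of averages is not the average of harmonic means.

```python
# Mean precision and recall copied from the card above.
p = 0.7488837575912476
r = 0.6433243179321289

# Pooled F1 (harmonic mean of the averaged precision and recall).
pooled_f1 = 2 * p * r / (p + r)
# pooled_f1 is ~0.692, near the reported mean per-example f1 of ~0.6917.
```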
pyp1/VoiceCraft_RealEdit
--- license: cc-by-nc-sa-4.0 ---
gimmaru/tweet_eval-irony
--- dataset_info: features: - name: text dtype: string - name: label dtype: class_label: names: '0': non_irony '1': irony splits: - name: test num_bytes: 75897 num_examples: 784 download_size: 0 dataset_size: 75897 --- # Dataset Card for "tweet_eval-irony" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) Note: This dataset was utilized for the evaluation of probability-based prompt selection techniques in the paper '[Improving Probability-based Prompt Selection Through Unified Evaluation and Analysis](https://arxiv.org/abs/2305.14877)'. It differs from the actual benchmark dataset.
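For reference, a tiny decoding helper for the `class_label` mapping above; the label names are copied from the card's dataset_info, while the helper itself is illustrative.

```python
# Label names as declared in the card's class_label block.
LABEL_NAMES = {0: "non_irony", 1: "irony"}

def decode_labels(label_ids):
    """Map integer labels from the dataset to their string names."""
    return [LABEL_NAMES[i] for i in label_ids]

decoded = decode_labels([0, 1, 1])
```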
gothstaf/questillma2
--- license: openrail ---
lirus18/deepfashion_with_captions_blowout
--- dataset_info: features: - name: image dtype: image - name: openpose dtype: image - name: cloth dtype: image - name: caption dtype: string splits: - name: train num_bytes: 4597369002.709 num_examples: 13679 download_size: 4429889834 dataset_size: 4597369002.709 --- # Dataset Card for "deepfashion_with_captions_blowout" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lhoestq/test
--- type: test annotations_creators: - expert-generated language_creators: - found language: - en license: - mit multilinguality: - monolingual size_categories: - n<1K source_datasets: - original task_categories: - other-test task_ids: - other-test paperswithcode_id: null pretty_name: Test Dataset --- This is a test dataset
gabasbch/icezada
--- license: openrail ---
one-sec-cv12/chunk_24
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 16896860208.875 num_examples: 175921 download_size: 15392804768 dataset_size: 16896860208.875 --- # Dataset Card for "chunk_24" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Isotonic__TinyQwex-4x620M-MoE
--- pretty_name: Evaluation run of Isotonic/TinyQwex-4x620M-MoE dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Isotonic/TinyQwex-4x620M-MoE](https://huggingface.co/Isotonic/TinyQwex-4x620M-MoE)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isotonic__TinyQwex-4x620M-MoE\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-07T02:55:53.030470](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__TinyQwex-4x620M-MoE/blob/main/results_2024-04-07T02-55-53.030470.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2424585468194838,\n\ \ \"acc_stderr\": 0.030320769072797325,\n \"acc_norm\": 0.24309999016079853,\n\ \ \"acc_norm_stderr\": 0.03112669387630501,\n \"mc1\": 0.22766217870257038,\n\ \ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4822967947124386,\n\ \ \"mc2_stderr\": 0.016765493106593414\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.21160409556313994,\n \"acc_stderr\": 0.011935916358632847,\n\ \ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25582553276239794,\n\ \ \"acc_stderr\": 0.004354325017137537,\n \"acc_norm\": 0.2619000199163513,\n\ \ \"acc_norm_stderr\": 0.004387699525854879\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\ \ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\ \ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\ \ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \ \ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\ \ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\ : 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \ \ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\ \ \"acc_stderr\": 0.03126511206173042,\n \"acc_norm\": 0.2138728323699422,\n\ \ \"acc_norm_stderr\": 0.03126511206173042\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\ \ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\ \ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\ \ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\ \ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\ acc_norm\": 
0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\ \ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\ \ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n\ \ \"acc_stderr\": 0.025649381063029247,\n \"acc_norm\": 0.2838709677419355,\n\ \ \"acc_norm_stderr\": 0.025649381063029247\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\ \ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\ : 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\ acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180362,\n\ \ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180362\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \ \ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \ \ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\ \ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\ acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.24220183486238533,\n \"acc_stderr\": 0.01836817630659862,\n \"\ acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.01836817630659862\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\ acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\ acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \ \ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3273542600896861,\n\ \ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.3273542600896861,\n\ \ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\ \ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\ acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\ \ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\ \ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\ \ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\ \ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\ \ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\ \ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\ \ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\ \ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\ \ \"acc_stderr\": 0.014355911964767864,\n 
\"acc_norm\": 0.2435754189944134,\n\ \ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\ \ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\ \ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\ \ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\ \ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \ \ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\ \ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\ \ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\ \ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \ \ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\ \ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\ \ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\ \ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\ \ \"acc_stderr\": 0.03106939026078943,\n \"acc_norm\": 0.19879518072289157,\n\ \ \"acc_norm_stderr\": 0.03106939026078943\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\ \ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\ \ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4822967947124386,\n\ \ \"mc2_stderr\": 0.016765493106593414\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.01405195606407689\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/Isotonic/TinyQwex-4x620M-MoE leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|arc:challenge|25_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-07T02-55-53.030470.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|gsm8k|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_07T02_55_53.030470 path: - 
'**/details_harness|hellaswag|10_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-55-53.030470.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-55-53.030470.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-55-53.030470.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-55-53.030470.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-55-53.030470.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-55-53.030470.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-55-53.030470.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-management|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-55-53.030470.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|truthfulqa:mc|0_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-07T02-55-53.030470.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_07T02_55_53.030470 path: - '**/details_harness|winogrande|5_2024-04-07T02-55-53.030470.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-07T02-55-53.030470.parquet' - config_name: results data_files: - split: 
2024_04_07T02_55_53.030470 path: - results_2024-04-07T02-55-53.030470.parquet - split: latest path: - results_2024-04-07T02-55-53.030470.parquet ---

# Dataset Card for Evaluation run of Isotonic/TinyQwex-4x620M-MoE

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Isotonic/TinyQwex-4x620M-MoE](https://huggingface.co/Isotonic/TinyQwex-4x620M-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Isotonic__TinyQwex-4x620M-MoE",
    "harness_winogrande_5",
    split="latest",
)
```

## Latest results

These are the [latest results from run 2024-04-07T02:55:53.030470](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__TinyQwex-4x620M-MoE/blob/main/results_2024-04-07T02-55-53.030470.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2424585468194838, "acc_stderr": 0.030320769072797325, "acc_norm": 0.24309999016079853, "acc_norm_stderr": 0.03112669387630501, "mc1": 0.22766217870257038, "mc1_stderr": 0.01467925503211107, "mc2": 0.4822967947124386, "mc2_stderr": 0.016765493106593414 }, "harness|arc:challenge|25": { "acc": 0.21160409556313994, "acc_stderr": 0.011935916358632847, "acc_norm": 0.2627986348122867, "acc_norm_stderr": 0.012862523175351333 }, "harness|hellaswag|10": { "acc": 0.25582553276239794, "acc_stderr": 0.004354325017137537, "acc_norm": 0.2619000199163513, "acc_norm_stderr": 0.004387699525854879 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2518518518518518, "acc_stderr": 0.037498507091740206, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.037498507091740206 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123387, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123387 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2679245283018868, "acc_stderr": 0.027257260322494845, "acc_norm": 0.2679245283018868, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.15, "acc_stderr": 0.03588702812826372, "acc_norm": 0.15, 
"acc_norm_stderr": 0.03588702812826372 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173042, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173042 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179961, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179961 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.32340425531914896, "acc_stderr": 0.030579442773610334, "acc_norm": 0.32340425531914896, "acc_norm_stderr": 0.030579442773610334 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.04227054451232199, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.04227054451232199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2206896551724138, "acc_stderr": 0.03455930201924811, "acc_norm": 0.2206896551724138, "acc_norm_stderr": 0.03455930201924811 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276864, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276864 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2838709677419355, "acc_stderr": 0.025649381063029247, "acc_norm": 0.2838709677419355, "acc_norm_stderr": 0.025649381063029247 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.270935960591133, "acc_stderr": 0.031270907132976984, "acc_norm": 0.270935960591133, "acc_norm_stderr": 0.031270907132976984 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24242424242424243, "acc_stderr": 0.03346409881055953, "acc_norm": 0.24242424242424243, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.21717171717171718, "acc_stderr": 0.029376616484945637, "acc_norm": 0.21717171717171718, "acc_norm_stderr": 0.029376616484945637 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.20725388601036268, "acc_stderr": 0.02925282329180362, "acc_norm": 0.20725388601036268, "acc_norm_stderr": 0.02925282329180362 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2205128205128205, "acc_stderr": 0.02102067268082791, "acc_norm": 0.2205128205128205, "acc_norm_stderr": 0.02102067268082791 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23109243697478993, "acc_stderr": 0.027381406927868966, "acc_norm": 0.23109243697478993, "acc_norm_stderr": 0.027381406927868966 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436775, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436775 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.24220183486238533, "acc_stderr": 0.01836817630659862, "acc_norm": 0.24220183486238533, "acc_norm_stderr": 0.01836817630659862 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.16203703703703703, "acc_stderr": 
0.02513045365226846, "acc_norm": 0.16203703703703703, "acc_norm_stderr": 0.02513045365226846 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23529411764705882, "acc_stderr": 0.029771775228145628, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.029771775228145628 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2616033755274262, "acc_stderr": 0.028609516716994934, "acc_norm": 0.2616033755274262, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3273542600896861, "acc_stderr": 0.031493846709941306, "acc_norm": 0.3273542600896861, "acc_norm_stderr": 0.031493846709941306 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.24793388429752067, "acc_stderr": 0.03941897526516303, "acc_norm": 0.24793388429752067, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2962962962962963, "acc_stderr": 0.04414343666854933, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.24539877300613497, "acc_stderr": 0.03380939813943354, "acc_norm": 0.24539877300613497, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.042878587513404544, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.042878587513404544 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690877, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690877 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004253, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.27, 
"acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.28735632183908044, "acc_stderr": 0.0161824107306827, "acc_norm": 0.28735632183908044, "acc_norm_stderr": 0.0161824107306827 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2435754189944134, "acc_stderr": 0.014355911964767864, "acc_norm": 0.2435754189944134, "acc_norm_stderr": 0.014355911964767864 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22875816993464052, "acc_stderr": 0.024051029739912258, "acc_norm": 0.22875816993464052, "acc_norm_stderr": 0.024051029739912258 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2733118971061093, "acc_stderr": 0.02531176597542612, "acc_norm": 0.2733118971061093, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2623456790123457, "acc_stderr": 0.02447722285613511, "acc_norm": 0.2623456790123457, "acc_norm_stderr": 0.02447722285613511 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2553191489361702, "acc_stderr": 0.02601199293090201, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.02601199293090201 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2392438070404172, "acc_stderr": 0.010896123652676651, "acc_norm": 0.2392438070404172, "acc_norm_stderr": 0.010896123652676651 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.20220588235294118, "acc_stderr": 0.02439819298665492, "acc_norm": 0.20220588235294118, "acc_norm_stderr": 0.02439819298665492 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2565359477124183, "acc_stderr": 0.01766784161237899, "acc_norm": 0.2565359477124183, "acc_norm_stderr": 0.01766784161237899 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.34545454545454546, 
"acc_stderr": 0.04554619617541054, "acc_norm": 0.34545454545454546, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.17142857142857143, "acc_stderr": 0.02412746346265015, "acc_norm": 0.17142857142857143, "acc_norm_stderr": 0.02412746346265015 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23880597014925373, "acc_stderr": 0.030147775935409224, "acc_norm": 0.23880597014925373, "acc_norm_stderr": 0.030147775935409224 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.19879518072289157, "acc_stderr": 0.03106939026078943, "acc_norm": 0.19879518072289157, "acc_norm_stderr": 0.03106939026078943 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0312678171466318, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.22766217870257038, "mc1_stderr": 0.01467925503211107, "mc2": 0.4822967947124386, "mc2_stderr": 0.016765493106593414 }, "harness|winogrande|5": { "acc": 0.5043409629044988, "acc_stderr": 0.01405195606407689 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
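The per-task entries in the results block above all share one shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so aggregate metrics can be recomputed from them directly. A minimal sketch, using a few illustrative values copied from the results above (the task selection is arbitrary and this is not the leaderboard's own aggregation code):

```python
from statistics import mean

# A few per-task results in the same shape as the card's JSON block
# (values copied from the results above; the selection is illustrative).
results = {
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.23, "acc_norm": 0.23},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.24, "acc_norm": 0.24},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.31, "acc_norm": 0.31},
}

# Unweighted mean accuracy over the selected tasks.
mmlu_acc = mean(task["acc"] for task in results.values())
print(round(mmlu_acc, 4))  # -> 0.26
```

The same pattern extends to `acc_norm` or to the full set of `harness|hendrycksTest-*` keys.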
joey234/mmlu-high_school_mathematics-original-neg-prepend
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: neg_prompt dtype: string splits: - name: test num_bytes: 15849 num_examples: 29 download_size: 15984 dataset_size: 15849 --- # Dataset Card for "mmlu-high_school_mathematics-original-neg-prepend" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
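The `answer` feature in the schema above is a class label: each example stores an integer 0–3 that maps to the choice letters A–D. A minimal stdlib sketch of that mapping (mirroring what `datasets.ClassLabel` provides via `int2str`/`str2int`, without the `datasets` dependency):

```python
# Choice letters in the order declared by the card's ClassLabel feature.
ANSWER_NAMES = ["A", "B", "C", "D"]

def int2str(label: int) -> str:
    """Map a stored integer label to its choice letter."""
    return ANSWER_NAMES[label]

def str2int(name: str) -> int:
    """Map a choice letter back to the stored integer label."""
    return ANSWER_NAMES.index(name)

# e.g. an example with answer == 2 marks choice "C" as correct.
print(int2str(2))  # -> C
```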
heliosprime/twitter_dataset_1713110824
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 15675 num_examples: 45 download_size: 15799 dataset_size: 15675 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "twitter_dataset_1713110824" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
gguichard/wsd_myriade_synth_data_gpt4turbo_val_1_bge
--- dataset_info: features: - name: tokens sequence: string - name: wn_sens sequence: int64 - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 5365648 num_examples: 7903 download_size: 1136695 dataset_size: 5365648 configs: - config_name: default data_files: - split: train path: data/train-* ---
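The features above are parallel sequences: `tokens` aligns with `wn_sens` at the word level, while `input_ids`, `attention_mask`, and `labels` align at the tokenizer level. A minimal sketch of the alignment invariants a record of this shape should satisfy (all values are made up, and the use of sentinel ids for special or unlabeled positions is an assumption, not something the card documents):

```python
# A made-up record in the shape of the card's features
# (real values would come from the dataset itself).
record = {
    "tokens": ["the", "bank", "was", "closed"],
    "wn_sens": [-1, 123456, -1, -1],  # one sense id per word (hypothetical)
    "input_ids": [101, 1996, 2924, 2001, 2701, 102],
    "attention_mask": [1, 1, 1, 1, 1, 1],
    "labels": [-100, -1, 123456, -1, -1, -100],  # -100 as ignore-index is an assumption
}

def check_alignment(rec: dict) -> bool:
    """Word-level fields align with each other; token-level fields likewise."""
    word_level = len(rec["tokens"]) == len(rec["wn_sens"])
    token_level = len(rec["input_ids"]) == len(rec["attention_mask"]) == len(rec["labels"])
    return word_level and token_level

print(check_alignment(record))  # -> True
```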
dododododo/test
--- license: apache-2.0 --- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/654907a4a1faff97850c4eff/9LfMe3bU93Vb4oNyArexx.png)
open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare_6x7
--- pretty_name: Evaluation run of mychen76/mistral-7b-merged-dare_6x7 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [mychen76/mistral-7b-merged-dare_6x7](https://huggingface.co/mychen76/mistral-7b-merged-dare_6x7)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare_6x7\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-12T01:27:30.665458](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare_6x7/blob/main/results_2024-03-12T01-27-30.665458.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563139414530638,\n\ \ \"acc_stderr\": 0.031967569574421976,\n \"acc_norm\": 0.6562534942043537,\n\ \ \"acc_norm_stderr\": 0.03262528877791407,\n \"mc1\": 0.5067319461444308,\n\ \ \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6698288226697681,\n\ \ \"mc2_stderr\": 0.015121056875692264\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6689419795221843,\n \"acc_stderr\": 0.013752062419817837,\n\ \ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778768\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6946823341963753,\n\ \ \"acc_stderr\": 0.004596006250433551,\n \"acc_norm\": 0.870444134634535,\n\ \ \"acc_norm_stderr\": 0.003351278403392407\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\ \ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\ \ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\ \ \"acc_norm_stderr\": 
0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\ : 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\ \ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\ \ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\ \ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\ : 
0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\ \ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\ \ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\ \ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\ : 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\ acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\ \ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\ \ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \ \ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \ \ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\ acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028075,\n \"\ acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028075\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\ acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\ acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \ \ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\ \ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\ acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\ \ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\ \ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\ \ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\ \ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\ \ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\ \ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\ \ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\ \ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\ \ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n\ \ \"acc_stderr\": 0.016704945740326188,\n 
\"acc_norm\": 0.4770949720670391,\n\ \ \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\ \ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.025755865922632952,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.025755865922632952\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\ : 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\"\ : 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n\ \ \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\ \ \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n\ \ \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\ \ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\ \ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\ \ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5067319461444308,\n\ \ \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6698288226697681,\n\ \ \"mc2_stderr\": 0.015121056875692264\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.01111698339239267\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \ \ \"acc_stderr\": 0.0124548416683377\n }\n}\n```" repo_url: https://huggingface.co/mychen76/mistral-7b-merged-dare_6x7 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|arc:challenge|25_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-12T01-27-30.665458.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|gsm8k|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_12T01_27_30.665458 path: - 
'**/details_harness|hellaswag|10_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T01-27-30.665458.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-12T01-27-30.665458.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T01-27-30.665458.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T01-27-30.665458.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T01-27-30.665458.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-12T01-27-30.665458.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T01-27-30.665458.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-management|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T01-27-30.665458.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|truthfulqa:mc|0_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-12T01-27-30.665458.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_12T01_27_30.665458 path: - '**/details_harness|winogrande|5_2024-03-12T01-27-30.665458.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-12T01-27-30.665458.parquet' - config_name: results data_files: - split: 
2024_03_12T01_27_30.665458 path: - results_2024-03-12T01-27-30.665458.parquet - split: latest path: - results_2024-03-12T01-27-30.665458.parquet
---

# Dataset Card for Evaluation run of mychen76/mistral-7b-merged-dare_6x7

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [mychen76/mistral-7b-merged-dare_6x7](https://huggingface.co/mychen76/mistral-7b-merged-dare_6x7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare_6x7",
    "harness_winogrande_5",
    split="train",
)
```

## Latest results

These are the [latest results from run 2024-03-12T01:27:30.665458](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare_6x7/blob/main/results_2024-03-12T01-27-30.665458.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6563139414530638,
        "acc_stderr": 0.031967569574421976,
        "acc_norm": 0.6562534942043537,
        "acc_norm_stderr": 0.03262528877791407,
        "mc1": 0.5067319461444308,
        "mc1_stderr": 0.017501914492655396,
        "mc2": 0.6698288226697681,
        "mc2_stderr": 0.015121056875692264
    },
    "harness|arc:challenge|25": { "acc": 0.6689419795221843, "acc_stderr": 0.013752062419817837, "acc_norm": 0.6962457337883959, "acc_norm_stderr": 0.013438909184778768 },
    "harness|hellaswag|10": { "acc": 0.6946823341963753, "acc_stderr": 0.004596006250433551, "acc_norm": 0.870444134634535, "acc_norm_stderr": 0.003351278403392407 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42063492063492064, "acc_stderr": 0.025424835086924, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.025424835086924 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356853, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356853 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.02911661760608301, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.02911661760608301 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7016806722689075, "acc_stderr": 0.02971914287634286, "acc_norm": 0.7016806722689075, "acc_norm_stderr": 0.02971914287634286 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8587155963302753, "acc_stderr": 0.014933868987028075, "acc_norm": 0.8587155963302753, "acc_norm_stderr": 0.014933868987028075 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 },
    "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.039166677628225836, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.039166677628225836 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4770949720670391, "acc_stderr": 0.016704945740326188, "acc_norm": 0.4770949720670391, "acc_norm_stderr": 0.016704945740326188 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.7450980392156863, "acc_stderr": 0.02495418432487991, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.02495418432487991 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632952, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632952 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.02979071924382972, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.02979071924382972 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.46284224250325945, "acc_stderr": 0.012734923579532069, "acc_norm": 0.46284224250325945, "acc_norm_stderr": 0.012734923579532069 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6985294117647058, "acc_stderr": 0.027875982114273168, "acc_norm": 0.6985294117647058, "acc_norm_stderr": 0.027875982114273168 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0190709855896875, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0190709855896875 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm":
0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.02853556033712844, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.02853556033712844 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5067319461444308, "mc1_stderr": 0.017501914492655396, "mc2": 0.6698288226697681, "mc2_stderr": 0.015121056875692264 }, "harness|winogrande|5": { "acc": 0.8058405682715075, "acc_stderr": 0.01111698339239267 }, "harness|gsm8k|5": { "acc": 0.7134192570128886, "acc_stderr": 0.0124548416683377 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
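The per-task entries in the results JSON above all share the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so they can be aggregated programmatically. A minimal sketch, using a small illustrative subset of the task values shown above:

```python
# Average the per-task accuracies from a harness-style results dict.
# The keys and value layout mirror the results JSON above; this subset
# of tasks is just an illustrative sample, not the full result set.
results = {
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.5024630541871922},
    "harness|hendrycksTest-high_school_computer_science|5": {"acc": 0.72},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8717948717948718},
}

# Select the MMLU ("hendrycksTest") tasks and compute their mean accuracy.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"mean acc over {len(mmlu_tasks)} tasks: {mean_acc:.4f}")
```

The same pattern extends to `acc_norm` or to other task families by changing the key filter.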
result-muse256-muse512-wuerst-sdv15/ac20e7b9
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 201 num_examples: 10 download_size: 1382 dataset_size: 201 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "ac20e7b9" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
LLM-Editor/prompt_template
--- dataset_info: features: - name: Question dtype: string splits: - name: train num_bytes: 4666 num_examples: 100 download_size: 3208 dataset_size: 4666 --- # Dataset Card for "prompt_template" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Nadav-Timor/PyPiBugs
--- license: other tags: - code - bugs - diff - repair size_categories: - 1K<n<10K --- See [Allamanis et al., 2021](https://arxiv.org/pdf/2105.12787.pdf) (NeurIPS 2021) for more information.
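Datasets tagged `code`/`bugs`/`diff`/`repair` like this one typically pair a buggy snippet with its fixed version. A minimal sketch of producing such a diff with Python's standard-library `difflib` (the example snippet and file names are illustrative, not this dataset's actual schema):

```python
import difflib

# A toy buggy/fixed pair; the bug is the missing square on the radius.
buggy = "def area(r):\n    return 3.14 * r\n"
fixed = "def area(r):\n    return 3.14 * r * r\n"

# unified_diff works on sequences of lines; keepends=True preserves newlines.
diff = "".join(
    difflib.unified_diff(
        buggy.splitlines(keepends=True),
        fixed.splitlines(keepends=True),
        fromfile="buggy.py",
        tofile="fixed.py",
    )
)
print(diff)
```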
davanstrien/EncyclopaediaBritannica
KAUE24122023/EricCartmanMartaRhaulin
--- license: openrail ---
thanaphatt1/LongAlpaca-16kcontext-enth-and-WikiQA
--- dataset_info: features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: file dtype: string splits: - name: train num_bytes: 1175520496.8399894 num_examples: 23801 download_size: 413457371 dataset_size: 1175520496.8399894 configs: - config_name: default data_files: - split: train path: data/train-* ---
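The `instruction`/`input`/`output` schema above follows the common Alpaca layout. A hedged sketch of rendering one record into a single training prompt (the template wording is an assumption for illustration, not taken from this card):

```python
def to_prompt(record: dict) -> str:
    """Render an instruction/input/output record as one training string,
    using a generic Alpaca-style template (assumed, not from the card)."""
    if record.get("input"):
        return (
            "### Instruction:\n" + record["instruction"] + "\n\n"
            "### Input:\n" + record["input"] + "\n\n"
            "### Response:\n" + record["output"]
        )
    # Records with an empty input field skip the Input section entirely.
    return (
        "### Instruction:\n" + record["instruction"] + "\n\n"
        "### Response:\n" + record["output"]
    )

example = {"instruction": "Summarize the passage.", "input": "Long text...", "output": "A summary."}
prompt = to_prompt(example)
```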
open-llm-leaderboard/details_UCLA-AGI__test
--- pretty_name: Evaluation run of UCLA-AGI/test dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [UCLA-AGI/test](https://huggingface.co/UCLA-AGI/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__test\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-05T00:36:41.239145](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test/blob/main/results_2024-01-05T00-36-41.239145.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083070915056247,\n\ \ \"acc_stderr\": 0.032946994490981554,\n \"acc_norm\": 0.6144414847119565,\n\ \ \"acc_norm_stderr\": 0.03362118464961407,\n \"mc1\": 0.408812729498164,\n\ \ \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5738893811625738,\n\ \ \"mc2_stderr\": 0.015990080392547533\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672876,\n\ \ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497728\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6759609639514041,\n\ \ \"acc_stderr\": 0.004670581884781161,\n \"acc_norm\": 0.8544114718183629,\n\ \ \"acc_norm_stderr\": 0.003519724163310889\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\ \ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n\ \ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\ \ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\ \ \"acc_norm_stderr\": 0.038760854559127644\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\ \ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\ \ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\ \ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\ \ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\ \ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\ \ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\ \ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\ acc_norm\": 0.3994708994708995,\n 
\"acc_norm_stderr\": 0.025225450284067884\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\ \ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\ \ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n\ \ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n\ \ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\ \ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603489,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603489\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\ acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\ \ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\ \ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \ \ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\ acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\ acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"\ acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\"\ : 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n\ \ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\ \ 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"\ acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\ \ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\ \ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\ \ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\ \ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\ \ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\ \ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\ \ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\ \ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n\ \ \"acc_stderr\": 0.015984204545268565,\n \"acc_norm\": 0.35307262569832404,\n\ \ \"acc_norm_stderr\": 
0.015984204545268565\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\ \ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\ \ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\ \ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\ \ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \ \ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\ \ \"acc_stderr\": 0.012700582404768223,\n \"acc_norm\": 0.44784876140808344,\n\ \ \"acc_norm_stderr\": 0.012700582404768223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\ \ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\ \ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.8059701492537313,\n\ \ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\ \ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \ \ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\ \ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\ \ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\ \ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n\ \ \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5738893811625738,\n\ \ \"mc2_stderr\": 0.015990080392547533\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30856709628506446,\n \ \ \"acc_stderr\": 0.012723076049815884\n }\n}\n```" repo_url: https://huggingface.co/UCLA-AGI/test leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|arc:challenge|25_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-05T00-36-41.239145.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|gsm8k|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hellaswag|10_2024-01-05T00-36-41.239145.parquet' - 
split: latest path: - '**/details_harness|hellaswag|10_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-36-41.239145.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-36-41.239145.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-36-41.239145.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-36-41.239145.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-36-41.239145.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-36-41.239145.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-36-41.239145.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-36-41.239145.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|truthfulqa:mc|0_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-05T00-36-41.239145.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_05T00_36_41.239145 path: - '**/details_harness|winogrande|5_2024-01-05T00-36-41.239145.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-05T00-36-41.239145.parquet' - config_name: results data_files: - split: 
2024_01_05T00_36_41.239145 path: - results_2024-01-05T00-36-41.239145.parquet - split: latest path: - results_2024-01-05T00-36-41.239145.parquet --- # Dataset Card for Evaluation run of UCLA-AGI/test <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [UCLA-AGI/test](https://huggingface.co/UCLA-AGI/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-05T00:36:41.239145](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test/blob/main/results_2024-01-05T00-36-41.239145.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6083070915056247, "acc_stderr": 0.032946994490981554, "acc_norm": 0.6144414847119565, "acc_norm_stderr": 0.03362118464961407, "mc1": 0.408812729498164, "mc1_stderr": 0.01720995215164173, "mc2": 0.5738893811625738, "mc2_stderr": 0.015990080392547533 }, "harness|arc:challenge|25": { "acc": 0.6168941979522184, "acc_stderr": 0.014206472661672876, "acc_norm": 0.658703071672355, "acc_norm_stderr": 0.013855831287497728 }, "harness|hellaswag|10": { "acc": 0.6759609639514041, "acc_stderr": 0.004670581884781161, "acc_norm": 0.8544114718183629, "acc_norm_stderr": 0.003519724163310889 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849724, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849724 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880263, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880263 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6875, "acc_stderr": 0.038760854559127644, "acc_norm": 0.6875, "acc_norm_stderr": 0.038760854559127644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.037336266553835096, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.037336266553835096 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082636, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.025225450284067884, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.025225450284067884 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949098, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949098 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7290322580645161, "acc_stderr": 0.025284416114900156, "acc_norm": 0.7290322580645161, "acc_norm_stderr": 0.025284416114900156 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.47783251231527096, "acc_stderr": 0.035145285621750094, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.035145285621750094 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.03287666758603489, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.03287666758603489 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198896, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198896 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.024639789097709443, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.024639789097709443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5692307692307692, "acc_stderr": 0.025106820660539753, "acc_norm": 0.5692307692307692, "acc_norm_stderr": 0.025106820660539753 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6386554621848739, "acc_stderr": 0.03120469122515002, "acc_norm": 0.6386554621848739, "acc_norm_stderr": 0.03120469122515002 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.036313298039696525, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.017381415563608674, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.017381415563608674 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4027777777777778, "acc_stderr": 0.03344887382997867, "acc_norm": 0.4027777777777778, 
"acc_norm_stderr": 0.03344887382997867 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.02786594228663933, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.02786594228663933 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.03915345408847836, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8376068376068376, "acc_stderr": 0.02416161812798774, "acc_norm": 0.8376068376068376, "acc_norm_stderr": 0.02416161812798774 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 
0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8058748403575989, "acc_stderr": 0.014143970276657567, "acc_norm": 0.8058748403575989, "acc_norm_stderr": 0.014143970276657567 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6907514450867052, "acc_stderr": 0.02488314057007176, "acc_norm": 0.6907514450867052, "acc_norm_stderr": 0.02488314057007176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.35307262569832404, "acc_stderr": 0.015984204545268565, "acc_norm": 0.35307262569832404, "acc_norm_stderr": 0.015984204545268565 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6928104575163399, "acc_stderr": 0.026415601914388992, "acc_norm": 0.6928104575163399, "acc_norm_stderr": 0.026415601914388992 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6784565916398714, "acc_stderr": 0.026527724079528872, "acc_norm": 0.6784565916398714, "acc_norm_stderr": 0.026527724079528872 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6635802469135802, "acc_stderr": 0.02628973494595293, "acc_norm": 0.6635802469135802, "acc_norm_stderr": 0.02628973494595293 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236837, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236837 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44784876140808344, "acc_stderr": 0.012700582404768223, "acc_norm": 0.44784876140808344, "acc_norm_stderr": 0.012700582404768223 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6176470588235294, "acc_stderr": 0.01965992249362335, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.01965992249362335 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.04494290866252091, "acc_norm": 0.6727272727272727, 
"acc_norm_stderr": 0.04494290866252091 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6489795918367347, "acc_stderr": 0.030555316755573637, "acc_norm": 0.6489795918367347, "acc_norm_stderr": 0.030555316755573637 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.027962677604768914, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.027962677604768914 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.408812729498164, "mc1_stderr": 0.01720995215164173, "mc2": 0.5738893811625738, "mc2_stderr": 0.015990080392547533 }, "harness|winogrande|5": { "acc": 0.7663772691397001, "acc_stderr": 0.011892194477183525 }, "harness|gsm8k|5": { "acc": 0.30856709628506446, "acc_stderr": 0.012723076049815884 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
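The per-task entries in the results above each pair an accuracy (`acc`) with a standard error (`acc_stderr`). As a sketch, these can be combined into a rough 95% interval under a normal approximation (an assumption of this example, not something the results file states; `confidence_interval` is a hypothetical helper):

```python
def confidence_interval(acc, stderr, z=1.96):
    """Approximate 95% interval for an accuracy given its standard error."""
    return (acc - z * stderr, acc + z * stderr)

# e.g. the winogrande entry above: acc = 0.7663..., acc_stderr = 0.0118...
lo, hi = confidence_interval(0.7663772691397001, 0.011892194477183525)
# → roughly (0.743, 0.790)
```

The same calculation applies to any `acc`/`acc_stderr` pair in the JSON block.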
jerryxu9001/cs6301project50k
--- license: mit dataset_info: features: - name: image dtype: image - name: expression dtype: string - name: img_width dtype: int64 - name: img_height dtype: int64 - name: x dtype: float64 - name: y dtype: float64 - name: w dtype: float64 - name: h dtype: float64 splits: - name: train num_bytes: 7128143566.0 num_examples: 40000 - name: test num_bytes: 1723596306.0 num_examples: 10000 download_size: 0 dataset_size: 8851739872.0 ---
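The feature list suggests each record stores an absolute pixel bounding box (`x`, `y`, `w`, `h`) alongside the image dimensions. Assuming that reading (the card itself does not document the fields), a small hypothetical helper can normalize a box into the [0, 1] range many detection models expect:

```python
def normalize_bbox(x, y, w, h, img_width, img_height):
    """Scale an absolute (x, y, w, h) pixel box into [0, 1] coordinates."""
    return (x / img_width, y / img_height, w / img_width, h / img_height)

# illustrative values only, not taken from an actual record
print(normalize_bbox(120.0, 60.0, 240.0, 180.0, img_width=480, img_height=360))
# → (0.25, 0.16666666666666666, 0.5, 0.5)
```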
joey234/mmlu-high_school_psychology-neg-prepend
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: negate_openai_prompt struct: - name: content dtype: string - name: role dtype: string - name: neg_question dtype: string - name: fewshot_context dtype: string - name: ori_prompt dtype: string - name: neg_prompt dtype: string - name: fewshot_context_neg dtype: string - name: fewshot_context_ori dtype: string splits: - name: dev num_bytes: 9644 num_examples: 5 - name: test num_bytes: 6027159 num_examples: 545 download_size: 475533 dataset_size: 6036803 configs: - config_name: default data_files: - split: dev path: data/dev-* - split: test path: data/test-* --- # Dataset Card for "mmlu-high_school_psychology-neg-prepend" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
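The `answer` feature is stored as a class-label index with names `A`–`D`, so a one-liner maps it back to the letter used in the prompts (a sketch based on the `names` mapping in the schema above; `answer_letter` is a hypothetical helper):

```python
ANSWER_NAMES = ["A", "B", "C", "D"]  # mirrors the class_label names in the schema

def answer_letter(label: int) -> str:
    """Map a stored class-label index back to its answer letter."""
    return ANSWER_NAMES[label]

print(answer_letter(2))  # → C
```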
dynamicslab/hydrogym-checkpoints
--- license: cc-by-4.0 ---
open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Limarpv3
--- pretty_name: Evaluation run of TeeZee/Xwin-LM-70B-V0.1_Limarpv3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TeeZee/Xwin-LM-70B-V0.1_Limarpv3](https://huggingface.co/TeeZee/Xwin-LM-70B-V0.1_Limarpv3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Limarpv3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-25T10:59:31.899107](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Limarpv3/blob/main/results_2024-01-25T10-59-31.899107.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6907434014004448,\n\ \ \"acc_stderr\": 0.030406546643218627,\n \"acc_norm\": 0.6960374849812471,\n\ \ \"acc_norm_stderr\": 0.03098820704077189,\n \"mc1\": 0.3818849449204406,\n\ \ \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5715047295306395,\n\ \ \"mc2_stderr\": 0.015147942199667246\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.013872423223718164,\n\ \ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403511\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6826329416450906,\n\ \ \"acc_stderr\": 0.004645003662067883,\n \"acc_norm\": 0.8697470623381797,\n\ \ \"acc_norm_stderr\": 0.0033589362798672655\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\ \ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\ \ \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \ \ \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\ \ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.03309615177059006\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\ : 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\ \ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\ \ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\ \ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\ \ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n\ \ \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\ \ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4444444444444444,\n \"acc_stderr\": 0.02559185776138218,\n \"\ acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 
0.02559185776138218\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"\ acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5467980295566502,\n \"acc_stderr\": 0.035025446508458714,\n \"\ acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.035025446508458714\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\ : 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0301176889295036,\n\ \ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0301176889295036\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.898989898989899,\n \"acc_stderr\": 0.021469735576055346,\n \"\ acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055346\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\ \ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.02323458108842849,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.02323458108842849\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\ : {\n \"acc\": 0.3333333333333333,\n 
\"acc_stderr\": 0.028742040903948485,\n\ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.726890756302521,\n \"acc_stderr\": 0.028942004040998167,\n \ \ \"acc_norm\": 0.726890756302521,\n \"acc_norm_stderr\": 0.028942004040998167\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\ acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8825688073394495,\n \"acc_stderr\": 0.01380278022737734,\n \"\ acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.01380278022737734\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\ acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813902,\n \"\ acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813902\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065505,\n \ \ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065505\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\ \ \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n\ \ \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\ \ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\ 
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\ \ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\ \ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\ \ \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n\ \ \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\ \ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\ \ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n\ \ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n\ \ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n\ \ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n 
\"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\ \ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\ \ \"acc_stderr\": 0.02383930331139821,\n \"acc_norm\": 0.7717041800643086,\n\ \ \"acc_norm_stderr\": 0.02383930331139821\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850987,\n\ \ \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850987\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \ \ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5404172099087353,\n\ \ \"acc_stderr\": 0.012728446067669943,\n \"acc_norm\": 0.5404172099087353,\n\ \ \"acc_norm_stderr\": 0.012728446067669943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.02725720260611495,\n\ \ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.02725720260611495\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.761437908496732,\n \"acc_stderr\": 0.017242385828779613,\n \ \ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.017242385828779613\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\ \ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\ \ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\ \ 
\"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \ \ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155754,\n\ \ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155754\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\ \ \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5715047295306395,\n\ \ \"mc2_stderr\": 0.015147942199667246\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267204\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.489764973464746,\n \ \ \"acc_stderr\": 0.013769598923012404\n }\n}\n```" repo_url: https://huggingface.co/TeeZee/Xwin-LM-70B-V0.1_Limarpv3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|arc:challenge|25_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-25T10-59-31.899107.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|gsm8k|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hellaswag|10_2024-01-25T10-59-31.899107.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-31.899107.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-31.899107.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-31.899107.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-31.899107.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-31.899107.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-31.899107.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-31.899107.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-59-31.899107.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|truthfulqa:mc|0_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-25T10-59-31.899107.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_25T10_59_31.899107 path: - '**/details_harness|winogrande|5_2024-01-25T10-59-31.899107.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-25T10-59-31.899107.parquet' - config_name: results data_files: - split: 
2024_01_25T10_59_31.899107 path: - results_2024-01-25T10-59-31.899107.parquet - split: latest path: - results_2024-01-25T10-59-31.899107.parquet
---

# Dataset Card for Evaluation run of TeeZee/Xwin-LM-70B-V0.1_Limarpv3

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [TeeZee/Xwin-LM-70B-V0.1_Limarpv3](https://huggingface.co/TeeZee/Xwin-LM-70B-V0.1_Limarpv3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Limarpv3",
    "harness_winogrande_5",
    split="latest",
)
```

## Latest results

These are the [latest results from run 2024-01-25T10:59:31.899107](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Xwin-LM-70B-V0.1_Limarpv3/blob/main/results_2024-01-25T10-59-31.899107.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6907434014004448, "acc_stderr": 0.030406546643218627, "acc_norm": 0.6960374849812471, "acc_norm_stderr": 0.03098820704077189, "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163495, "mc2": 0.5715047295306395, "mc2_stderr": 0.015147942199667246 }, "harness|arc:challenge|25": { "acc": 0.6569965870307167, "acc_stderr": 0.013872423223718164, "acc_norm": 0.7081911262798635, "acc_norm_stderr": 0.013284525292403511 }, "harness|hellaswag|10": { "acc": 0.6826329416450906, "acc_stderr": 0.004645003662067883, "acc_norm": 0.8697470623381797, "acc_norm_stderr": 0.0033589362798672655 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8289473684210527, "acc_stderr": 0.03064360707167709, "acc_norm": 0.8289473684210527, "acc_norm_stderr": 0.03064360707167709 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768081, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754407, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754407 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8055555555555556, "acc_stderr": 0.03309615177059006, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.03309615177059006 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 
0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6416184971098265, "acc_stderr": 0.03656343653353159, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.03656343653353159 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062946, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6893617021276596, "acc_stderr": 0.03025123757921317, "acc_norm": 0.6893617021276596, "acc_norm_stderr": 0.03025123757921317 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6206896551724138, "acc_stderr": 0.04043461861916747, "acc_norm": 0.6206896551724138, "acc_norm_stderr": 0.04043461861916747 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.02559185776138218, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.02559185776138218 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8129032258064516, "acc_stderr": 0.022185710092252252, "acc_norm": 0.8129032258064516, "acc_norm_stderr": 0.022185710092252252 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5467980295566502, "acc_stderr": 0.035025446508458714, "acc_norm": 0.5467980295566502, "acc_norm_stderr": 0.035025446508458714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8181818181818182, "acc_stderr": 0.0301176889295036, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.0301176889295036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.898989898989899, "acc_stderr": 0.021469735576055346, "acc_norm": 0.898989898989899, "acc_norm_stderr": 0.021469735576055346 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9430051813471503, "acc_stderr": 0.01673108529360755, "acc_norm": 0.9430051813471503, "acc_norm_stderr": 0.01673108529360755 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7, "acc_stderr": 0.02323458108842849, "acc_norm": 0.7, "acc_norm_stderr": 0.02323458108842849 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.726890756302521, "acc_stderr": 0.028942004040998167, "acc_norm": 0.726890756302521, "acc_norm_stderr": 0.028942004040998167 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.45695364238410596, "acc_stderr": 0.04067325174247443, "acc_norm": 0.45695364238410596, "acc_norm_stderr": 0.04067325174247443 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8825688073394495, "acc_stderr": 0.01380278022737734, "acc_norm": 0.8825688073394495, "acc_norm_stderr": 0.01380278022737734 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5648148148148148, "acc_stderr": 0.033812000056435254, "acc_norm": 
0.5648148148148148, "acc_norm_stderr": 0.033812000056435254 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813902, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813902 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8987341772151899, "acc_stderr": 0.019637720526065505, "acc_norm": 0.8987341772151899, "acc_norm_stderr": 0.019637720526065505 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7847533632286996, "acc_stderr": 0.027584066602208274, "acc_norm": 0.7847533632286996, "acc_norm_stderr": 0.027584066602208274 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8320610687022901, "acc_stderr": 0.032785485373431386, "acc_norm": 0.8320610687022901, "acc_norm_stderr": 0.032785485373431386 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8760330578512396, "acc_stderr": 0.03008309871603521, "acc_norm": 0.8760330578512396, "acc_norm_stderr": 0.03008309871603521 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8159509202453987, "acc_stderr": 0.030446777687971726, "acc_norm": 0.8159509202453987, "acc_norm_stderr": 0.030446777687971726 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.03760178006026621, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.03760178006026621 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092365, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, 
"acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8659003831417624, "acc_stderr": 0.012185528166499978, "acc_norm": 0.8659003831417624, "acc_norm_stderr": 0.012185528166499978 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7803468208092486, "acc_stderr": 0.022289638852617893, "acc_norm": 0.7803468208092486, "acc_norm_stderr": 0.022289638852617893 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5463687150837989, "acc_stderr": 0.016650437588269076, "acc_norm": 0.5463687150837989, "acc_norm_stderr": 0.016650437588269076 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958157, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958157 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7717041800643086, "acc_stderr": 0.02383930331139821, "acc_norm": 0.7717041800643086, "acc_norm_stderr": 0.02383930331139821 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.845679012345679, "acc_stderr": 0.020100830999850987, "acc_norm": 0.845679012345679, "acc_norm_stderr": 0.020100830999850987 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5319148936170213, "acc_stderr": 0.02976667507587387, "acc_norm": 0.5319148936170213, "acc_norm_stderr": 0.02976667507587387 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5404172099087353, "acc_stderr": 0.012728446067669943, "acc_norm": 0.5404172099087353, "acc_norm_stderr": 0.012728446067669943 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7205882352941176, "acc_stderr": 0.02725720260611495, "acc_norm": 0.7205882352941176, "acc_norm_stderr": 0.02725720260611495 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.761437908496732, "acc_stderr": 0.017242385828779613, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.017242385828779613 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, 
"acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8, "acc_stderr": 0.02560737598657916, "acc_norm": 0.8, "acc_norm_stderr": 0.02560737598657916 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.0272659924344291, "acc_norm": 0.92, "acc_norm_stderr": 0.0272659924344291 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.025172984350155754, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.025172984350155754 }, "harness|truthfulqa:mc|0": { "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163495, "mc2": 0.5715047295306395, "mc2_stderr": 0.015147942199667246 }, "harness|winogrande|5": { "acc": 0.8176795580110497, "acc_stderr": 0.010851565594267204 }, "harness|gsm8k|5": { "acc": 0.489764973464746, "acc_stderr": 0.013769598923012404 } }
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset.
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
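The aggregated "Latest results" JSON shown earlier in this card keys each task as `harness|<task>|<n_shot>` with `acc`/`acc_stderr` metrics. As an illustrative sketch of consuming that structure offline (the `results` dict below is a small hand-copied excerpt of the values reported above, not a live download, and the helper name `task_accuracies` is our own), one might flatten the per-task accuracies like this:

```python
# Illustrative sketch: flattening per-task accuracy from a results payload
# shaped like the "Latest results" JSON above. The `results` dict is a small
# excerpt of the reported values, not data fetched from the Hub.
results = {
    "harness|winogrande|5": {"acc": 0.8176795580110497, "acc_stderr": 0.010851565594267204},
    "harness|gsm8k|5": {"acc": 0.489764973464746, "acc_stderr": 0.013769598923012404},
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.8771929824561403,
        "acc_stderr": 0.025172984350155754,
    },
}

def task_accuracies(results: dict) -> dict:
    """Map the task name (middle field of 'harness|<task>|<n_shot>') to its accuracy."""
    return {key.split("|")[1]: metrics["acc"] for key, metrics in results.items()}

accs = task_accuracies(results)
print(accs["gsm8k"])  # 0.489764973464746
```

The same pattern applies to any of the per-run `results_*.json` files linked from this card, since they all use the `harness|<task>|<n_shot>` key layout.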
turing-motors/LLaVA-Pretrain-JA
--- license: other task_categories: - visual-question-answering - question-answering language: - ja pretty_name: Japanese LLaVA Pretrain size_categories: - 100K<n<1M --- ## Dataset Details **Dataset Type:** Japanese LLaVA Pretrain is a localized version of the original LLaVA Pretrain dataset. This version is translated into Japanese using the DeepL API and is aimed at serving similar purposes in the context of the Japanese language. **Resources for More Information:** For information on the original dataset: [LLaVA](https://llava-vl.github.io/) **License:** Must comply with the licenses of CC-3M and BLIP (if you use their synthetic captions). CC-3M The dataset may be freely used for any purpose, although acknowledgement of Google LLC ("Google") as the data source would be appreciated. The dataset is provided "AS IS" without any warranty, express or implied. Google disclaims all liability for any damages, direct or indirect, resulting from the use of the dataset. Same as [the original dataset](https://huggingface.co/datasets/liuhaotian/LLaVA-Pretrain). **Questions or Comments:** For questions or comments about the original model, you can go to [LLaVA GitHub Issues](https://github.com/haotian-liu/LLaVA/issues). ## Intended Use **Primary Intended Uses:** The primary use of this translated dataset is research on large multimodal models and chatbots in a Japanese context. **Primary Intended Users:** The primary intended users are researchers and hobbyists interested in computer vision, natural language processing, machine learning, and artificial intelligence, particularly those focusing on the Japanese language. --- **Note:** This dataset is a translation of the original LLaVA-Pretrain, carried out using the DeepL API. The license remains the same as the original dataset. ---
open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Symbolic
--- pretty_name: Evaluation run of hedronstone/OpenHermes-7B-Symbolic dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [hedronstone/OpenHermes-7B-Symbolic](https://huggingface.co/hedronstone/OpenHermes-7B-Symbolic)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Symbolic\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-11T06:22:23.753929](https://huggingface.co/datasets/open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Symbolic/blob/main/results_2023-12-11T06-22-23.753929.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6264107604791354,\n\ \ \"acc_stderr\": 0.03244629935008131,\n \"acc_norm\": 0.6296805420206979,\n\ \ \"acc_norm_stderr\": 0.03308716638008267,\n \"mc1\": 0.33047735618115054,\n\ \ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.48821727865548903,\n\ \ \"mc2_stderr\": 0.0150448263523402\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\ \ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042196\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6386178052180841,\n\ \ \"acc_stderr\": 0.004794191785967951,\n \"acc_norm\": 0.8273252340171281,\n\ \ \"acc_norm_stderr\": 0.0037719340427991577\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\ \ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\ \ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\ \ \"acc_norm_stderr\": 
0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\ : 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\ \ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\ \ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\ \ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\ \ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\ \ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406786,\n \"\ acc_norm\": 
0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406786\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\ \ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\ \ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"\ acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\ acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\ : 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\ : 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709447,\n\ \ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709447\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \ \ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \ \ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188705,\n \ \ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188705\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\ acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\ acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\ acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \ \ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\ \ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\ \ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\ \ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\ acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\ \ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\ \ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\ \ \"acc_stderr\": 0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n\ \ \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\ \ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\ \ \"acc_stderr\": 0.014422292204808842,\n 
\"acc_norm\": 0.24692737430167597,\n\ \ \"acc_norm_stderr\": 0.014422292204808842\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\ \ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\ \ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\ \ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \ \ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\ \ \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n\ \ \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n\ \ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \ \ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n\ \ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\ \ \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n\ \ \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\ \ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\ \ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\ \ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\ \ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.48821727865548903,\n\ \ \"mc2_stderr\": 0.0150448263523402\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011879\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5344958301743745,\n \ \ \"acc_stderr\": 0.013739668147545915\n }\n}\n```" repo_url: https://huggingface.co/hedronstone/OpenHermes-7B-Symbolic leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_11T06_22_23.753929 path: - '**/details_harness|arc:challenge|25_2023-12-11T06-22-23.753929.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-11T06-22-23.753929.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_11T06_22_23.753929 path: - '**/details_harness|gsm8k|5_2023-12-11T06-22-23.753929.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-11T06-22-23.753929.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_11T06_22_23.753929 
path: - '**/details_harness|hellaswag|10_2023-12-11T06-22-23.753929.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-11T06-22-23.753929.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_11T06_22_23.753929 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T06-22-23.753929.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-11T06-22-23.753929.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T06-22-23.753929.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T06-22-23.753929.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T06-22-23.753929.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T06-22-23.753929.parquet' - 
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-12-11T06-22-23.753929.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-12-11T06-22-23.753929.parquet'
- config_name: harness_winogrande_5
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - '**/details_harness|winogrande|5_2023-12-11T06-22-23.753929.parquet'
  - split: latest
    path:
    - '**/details_harness|winogrande|5_2023-12-11T06-22-23.753929.parquet'
- config_name: results
  data_files:
  - split: 2023_12_11T06_22_23.753929
    path:
    - results_2023-12-11T06-22-23.753929.parquet
  - split: latest
    path:
    - results_2023-12-11T06-22-23.753929.parquet
---

# Dataset Card for Evaluation run of hedronstone/OpenHermes-7B-Symbolic

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/hedronstone/OpenHermes-7B-Symbolic
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [hedronstone/OpenHermes-7B-Symbolic](https://huggingface.co/hedronstone/OpenHermes-7B-Symbolic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Symbolic",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-11T06:22:23.753929](https://huggingface.co/datasets/open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Symbolic/blob/main/results_2023-12-11T06-22-23.753929.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {"acc": 0.6264107604791354, "acc_stderr": 0.03244629935008131, "acc_norm": 0.6296805420206979, "acc_norm_stderr": 0.03308716638008267, "mc1": 0.33047735618115054, "mc1_stderr": 0.016466769613698296, "mc2": 0.48821727865548903, "mc2_stderr": 0.0150448263523402},
    "harness|arc:challenge|25": {"acc": 0.5870307167235495, "acc_stderr": 0.014388344935398326, "acc_norm": 0.6313993174061433, "acc_norm_stderr": 0.014097810678042196},
    "harness|hellaswag|10": {"acc": 0.6386178052180841, "acc_stderr": 0.004794191785967951, "acc_norm": 0.8273252340171281, "acc_norm_stderr": 0.0037719340427991577},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836},
    "harness|hendrycksTest-clinical_knowledge|5": {"acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637},
    "harness|hendrycksTest-college_biology|5": {"acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284},
    "harness|hendrycksTest-college_computer_science|5": {"acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034},
    "harness|hendrycksTest-college_medicine|5": {"acc": 0.6011560693641619, "acc_stderr": 0.0373362665538351, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.0373362665538351},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261},
    "harness|hendrycksTest-conceptual_physics|5": {"acc": 0.5446808510638298, "acc_stderr": 0.03255525359340355, "acc_norm": 0.5446808510638298, "acc_norm_stderr": 0.03255525359340355},
    "harness|hendrycksTest-econometrics|5": {"acc": 0.4649122807017544, "acc_stderr": 0.04692008381368909, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.04692008381368909},
    "harness|hendrycksTest-electrical_engineering|5": {"acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375},
    "harness|hendrycksTest-elementary_mathematics|5": {"acc": 0.42328042328042326, "acc_stderr": 0.025446365634406786, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406786},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196},
    "harness|hendrycksTest-high_school_biology|5": {"acc": 0.7806451612903226, "acc_stderr": 0.02354079935872329, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.02354079935872329},
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616},
    "harness|hendrycksTest-high_school_computer_science|5": {"acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632},
    "harness|hendrycksTest-high_school_european_history|5": {"acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953},
    "harness|hendrycksTest-high_school_geography|5": {"acc": 0.8181818181818182, "acc_stderr": 0.0274796030105388, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.0274796030105388},
    "harness|hendrycksTest-high_school_government_and_politics|5": {"acc": 0.8652849740932642, "acc_stderr": 0.024639789097709447, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.024639789097709447},
    "harness|hendrycksTest-high_school_macroeconomics|5": {"acc": 0.6333333333333333, "acc_stderr": 0.02443301646605246, "acc_norm": 0.6333333333333333, "acc_norm_stderr": 0.02443301646605246},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.29259259259259257, "acc_stderr": 0.027738969632176088, "acc_norm": 0.29259259259259257, "acc_norm_stderr": 0.027738969632176088},
    "harness|hendrycksTest-high_school_microeconomics|5": {"acc": 0.6932773109243697, "acc_stderr": 0.02995382389188705, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188705},
    "harness|hendrycksTest-high_school_physics|5": {"acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169},
    "harness|hendrycksTest-high_school_psychology|5": {"acc": 0.8330275229357799, "acc_stderr": 0.01599015488507338, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507338},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507},
    "harness|hendrycksTest-high_school_us_history|5": {"acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325},
    "harness|hendrycksTest-high_school_world_history|5": {"acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.695067264573991, "acc_stderr": 0.030898610882477518, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477518},
    "harness|hendrycksTest-human_sexuality|5": {"acc": 0.732824427480916, "acc_stderr": 0.038808483010823944, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.038808483010823944},
    "harness|hendrycksTest-international_law|5": {"acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282},
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735},
    "harness|hendrycksTest-logical_fallacies|5": {"acc": 0.7607361963190185, "acc_stderr": 0.033519538795212696, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.033519538795212696},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116},
    "harness|hendrycksTest-management|5": {"acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761, "acc_stderr": 0.021586494001281382, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281382},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505},
    "harness|hendrycksTest-miscellaneous|5": {"acc": 0.8160919540229885, "acc_stderr": 0.013853724170922524, "acc_norm": 0.8160919540229885, "acc_norm_stderr": 0.013853724170922524},
    "harness|hendrycksTest-moral_disputes|5": {"acc": 0.7023121387283237, "acc_stderr": 0.024617055388677003, "acc_norm": 0.7023121387283237, "acc_norm_stderr": 0.024617055388677003},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.24692737430167597, "acc_stderr": 0.014422292204808842, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808842},
    "harness|hendrycksTest-nutrition|5": {"acc": 0.7352941176470589, "acc_stderr": 0.02526169121972948, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02526169121972948},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266},
    "harness|hendrycksTest-prehistory|5": {"acc": 0.7160493827160493, "acc_stderr": 0.025089478523765137, "acc_norm": 0.7160493827160493, "acc_norm_stderr": 0.025089478523765137},
    "harness|hendrycksTest-professional_accounting|5": {"acc": 0.4432624113475177, "acc_stderr": 0.029634838473766006, "acc_norm": 0.4432624113475177, "acc_norm_stderr": 0.029634838473766006},
    "harness|hendrycksTest-professional_law|5": {"acc": 0.4654498044328553, "acc_stderr": 0.012739711554045704, "acc_norm": 0.4654498044328553, "acc_norm_stderr": 0.012739711554045704},
    "harness|hendrycksTest-professional_medicine|5": {"acc": 0.6507352941176471, "acc_stderr": 0.028959755196824866, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.028959755196824866},
    "harness|hendrycksTest-professional_psychology|5": {"acc": 0.6535947712418301, "acc_stderr": 0.01924978569171721, "acc_norm": 0.6535947712418301, "acc_norm_stderr": 0.01924978569171721},
    "harness|hendrycksTest-public_relations|5": {"acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054},
    "harness|hendrycksTest-security_studies|5": {"acc": 0.7224489795918367, "acc_stderr": 0.028666857790274645, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274645},
    "harness|hendrycksTest-sociology|5": {"acc": 0.8308457711442786, "acc_stderr": 0.02650859065623326, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623326},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371},
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804},
    "harness|truthfulqa:mc|0": {"mc1": 0.33047735618115054, "mc1_stderr": 0.016466769613698296, "mc2": 0.48821727865548903, "mc2_stderr": 0.0150448263523402},
    "harness|winogrande|5": {"acc": 0.7584846093133386, "acc_stderr": 0.012028983782011879},
    "harness|gsm8k|5": {"acc": 0.5344958301743745, "acc_stderr": 0.013739668147545915}
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
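The aggregated "all" entry in the latest results is a macro-average over the per-task metrics. As a quick illustration of that aggregation, here is a minimal sketch that recomputes a macro-average accuracy from a small, hypothetical subset of the per-task scores reported above (the subset is illustrative only, so its value will not match the full aggregate over all evaluated tasks):

```python
# Macro-average accuracy over a hypothetical subset of the per-task scores above.
per_task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.29,
    "harness|hendrycksTest-anatomy|5": 0.5555555555555556,
    "harness|hendrycksTest-astronomy|5": 0.6776315789473685,
}

# Each task contributes equally, regardless of how many questions it contains.
macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(f"macro-average accuracy over {len(per_task_acc)} tasks: {macro_avg:.4f}")
```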
open-llm-leaderboard/details_u-chom__ex-llm-e1
---
pretty_name: Evaluation run of u-chom/ex-llm-e1
dataset_summary: "Dataset automatically created during the evaluation run of model
  [u-chom/ex-llm-e1](https://huggingface.co/u-chom/ex-llm-e1) on the [Open LLM
  Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe
  dataset is composed of 63 configurations, each one corresponding to one of the
  evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be
  found as a specific split in each configuration, the split being named using the
  timestamp of the run. The \"train\" split is always pointing to the latest
  results.\n\nAn additional configuration \"results\" stores all the aggregated
  results of the run (and is used to compute and display the aggregated metrics on
  the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo
  load the details from a run, you can for instance do the following:\n```python\nfrom
  datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_u-chom__ex-llm-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n##
  Latest results\n\nThese are the [latest results from run 2023-12-09T14:50:53.053467](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__ex-llm-e1/blob/main/results_2023-12-09T14-50-53.053467.json)
  (note that there might be results for other tasks in the repos if successive
  evals didn't cover the same tasks. You find each in the results and the
  \"latest\" split for each eval):\n\n```python\n{\n    \"all\": {\"acc\": 0.39402023545271236,
  \"acc_stderr\": 0.03431633094943852, \"acc_norm\": 0.3992950895868925,
  \"acc_norm_stderr\": 0.03515648307530415, \"mc1\": 0.2631578947368421,
  \"mc1_stderr\": 0.015415241740237009, \"mc2\": 0.4200995329344425,
  \"mc2_stderr\": 0.01434315654117436},\n    \"harness|arc:challenge|25\":
  {\"acc\": 0.35921501706484643, \"acc_stderr\": 0.014020224155839159,
  \"acc_norm\": 0.3993174061433447, \"acc_norm_stderr\": 0.014312094557946698},\n
  \   \"harness|hellaswag|10\": {\"acc\": 0.5060744871539534, \"acc_stderr\":
  0.004989413158034801, \"acc_norm\": 0.6811392152957578, \"acc_norm_stderr\":
  0.004650825168905203},\n    \"harness|hendrycksTest-abstract_algebra|5\":
  {\"acc\": 0.27, \"acc_stderr\": 0.044619604333847415, \"acc_norm\": 0.27,
  \"acc_norm_stderr\": 0.044619604333847415},\n    \"harness|hendrycksTest-anatomy|5\":
  {\"acc\": 0.45925925925925926, \"acc_stderr\": 0.04304979692464242,
  \"acc_norm\": 0.45925925925925926, \"acc_norm_stderr\": 0.04304979692464242},\n
  \   \"harness|hendrycksTest-astronomy|5\": {\"acc\": 0.46710526315789475,
  \"acc_stderr\": 0.04060127035236397, \"acc_norm\": 0.46710526315789475,
  \"acc_norm_stderr\": 0.04060127035236397},\n    \"harness|hendrycksTest-business_ethics|5\":
  {\"acc\": 0.41, \"acc_stderr\": 0.04943110704237102, \"acc_norm\": 0.41,
  \"acc_norm_stderr\": 0.04943110704237102},\n    \"harness|hendrycksTest-clinical_knowledge|5\":
  {\"acc\": 0.41509433962264153, \"acc_stderr\": 0.03032594578928611,
  \"acc_norm\": 0.41509433962264153, \"acc_norm_stderr\": 0.03032594578928611},\n
  \   \"harness|hendrycksTest-college_biology|5\": {\"acc\": 0.3680555555555556,
  \"acc_stderr\": 0.040329990539607195, \"acc_norm\": 0.3680555555555556,
  \"acc_norm_stderr\": 0.040329990539607195},\n    \"harness|hendrycksTest-college_chemistry|5\":
  {\"acc\": 0.31, \"acc_stderr\": 0.04648231987117317, \"acc_norm\": 0.31,
  \"acc_norm_stderr\": 0.04648231987117317},\n    \"harness|hendrycksTest-college_computer_science|5\":
  {\"acc\": 0.36, \"acc_stderr\": 0.04824181513244218, \"acc_norm\": 0.36,
  \"acc_norm_stderr\": 0.04824181513244218},\n    \"harness|hendrycksTest-college_mathematics|5\":
  {\"acc\": 0.31, \"acc_stderr\": 0.04648231987117316, \"acc_norm\": 0.31,
  \"acc_norm_stderr\": 0.04648231987117316},\n    \"harness|hendrycksTest-college_medicine|5\":
  {\"acc\": 0.3699421965317919, \"acc_stderr\": 0.036812296333943194,
  \"acc_norm\": 0.3699421965317919, \"acc_norm_stderr\": 0.036812296333943194},\n
  \   \"harness|hendrycksTest-college_physics|5\": {\"acc\": 0.21568627450980393,
  \"acc_stderr\": 0.04092563958237654, \"acc_norm\": 0.21568627450980393,
  \"acc_norm_stderr\": 0.04092563958237654},\n    \"harness|hendrycksTest-computer_security|5\":
  {\"acc\": 0.46, \"acc_stderr\": 0.05009082659620333, \"acc_norm\": 0.46,
  \"acc_norm_stderr\": 0.05009082659620333},\n    \"harness|hendrycksTest-conceptual_physics|5\":
  {\"acc\": 0.33191489361702126, \"acc_stderr\": 0.030783736757745643,
  \"acc_norm\": 0.33191489361702126, \"acc_norm_stderr\": 0.030783736757745643},\n
  \   \"harness|hendrycksTest-econometrics|5\": {\"acc\": 0.2807017543859649,
  \"acc_stderr\": 0.042270544512322004, \"acc_norm\": 0.2807017543859649,
  \"acc_norm_stderr\": 0.042270544512322004},\n    \"harness|hendrycksTest-electrical_engineering|5\":
  {\"acc\": 0.3793103448275862, \"acc_stderr\": 0.040434618619167466,
  \"acc_norm\": 0.3793103448275862, \"acc_norm_stderr\": 0.040434618619167466},\n
  \   \"harness|hendrycksTest-elementary_mathematics|5\": {\"acc\":
  0.2830687830687831, \"acc_stderr\": 0.023201392938194974, \"acc_norm\":
0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3709677419354839,\n\ \ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.3709677419354839,\n\ \ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n\ \ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\ : 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.0389853160557942,\n\ \ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.0389853160557942\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"\ acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n\ \ \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.02345467488940429,\n\ \ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.02345467488940429\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \ \ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188705,\n \ \ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188705\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.44220183486238535,\n \"acc_stderr\": 0.02129361320752021,\n \"\ acc_norm\": 0.44220183486238535,\n \"acc_norm_stderr\": 0.02129361320752021\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\ acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"\ acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.510548523206751,\n \"acc_stderr\": 0.032539983791662855,\n \ \ \"acc_norm\": 0.510548523206751,\n \"acc_norm_stderr\": 0.032539983791662855\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\ \ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n\ \ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.44274809160305345,\n \"acc_stderr\": 0.04356447202665069,\n\ \ \"acc_norm\": 0.44274809160305345,\n \"acc_norm_stderr\": 0.04356447202665069\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\ : 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\ \ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n\ \ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.038258255488486076,\n\ \ \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.038258255488486076\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\ \ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\ \ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.047211885060971716,\n\ \ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.047211885060971716\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5085470085470085,\n\ \ \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.5085470085470085,\n\ \ \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465918,\n \ \ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465918\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.508301404853129,\n\ \ \"acc_stderr\": 0.017877498991072,\n \"acc_norm\": 0.508301404853129,\n\ \ \"acc_norm_stderr\": 0.017877498991072\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.026454578146931494,\n\ \ \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.026454578146931494\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n\ \ \"acc_stderr\": 0.015218109544410179,\n \"acc_norm\": 
0.2927374301675978,\n\ \ \"acc_norm_stderr\": 0.015218109544410179\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.028245134024387285,\n\ \ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.028245134024387285\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.39228295819935693,\n\ \ \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.39228295819935693,\n\ \ \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.41358024691358025,\n \"acc_stderr\": 0.02740204204026994,\n\ \ \"acc_norm\": 0.41358024691358025,\n \"acc_norm_stderr\": 0.02740204204026994\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3262411347517731,\n \"acc_stderr\": 0.027968453043563168,\n \ \ \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.027968453043563168\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3305084745762712,\n\ \ \"acc_stderr\": 0.01201414210184297,\n \"acc_norm\": 0.3305084745762712,\n\ \ \"acc_norm_stderr\": 0.01201414210184297\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\ \ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.3709150326797386,\n \"acc_stderr\": 0.019542101564854114,\n \ \ \"acc_norm\": 0.3709150326797386,\n \"acc_norm_stderr\": 0.019542101564854114\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\ \ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\ \ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n\ \ \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4975124378109453,\n\ \ \"acc_stderr\": 0.03535490150137288,\n \"acc_norm\": 0.4975124378109453,\n\ \ \"acc_norm_stderr\": 0.03535490150137288\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \ \ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\ \ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n\ \ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n\ \ \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\ \ \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.4200995329344425,\n\ \ \"mc2_stderr\": 0.01434315654117436\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.648776637726914,\n \"acc_stderr\": 0.013415981370545135\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.043214556482183475,\n \ \ \"acc_stderr\": 0.005600987515237865\n }\n}\n```" repo_url: https://huggingface.co/u-chom/ex-llm-e1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|arc:challenge|25_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-09T14-50-53.053467.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|gsm8k|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_09T14_50_53.053467 path: - 
'**/details_harness|hellaswag|10_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-50-53.053467.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|truthfulqa:mc|0_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-09T14-50-53.053467.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_09T14_50_53.053467 path: - '**/details_harness|winogrande|5_2023-12-09T14-50-53.053467.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-09T14-50-53.053467.parquet' - config_name: results data_files: - split: 
2023_12_09T14_50_53.053467
    path:
    - results_2023-12-09T14-50-53.053467.parquet
  - split: latest
    path:
    - results_2023-12-09T14-50-53.053467.parquet
---

# Dataset Card for Evaluation run of u-chom/ex-llm-e1

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/u-chom/ex-llm-e1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [u-chom/ex-llm-e1](https://huggingface.co/u-chom/ex-llm-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_u-chom__ex-llm-e1",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-12-09T14:50:53.053467](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__ex-llm-e1/blob/main/results_2023-12-09T14-50-53.053467.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.39402023545271236, "acc_stderr": 0.03431633094943852, "acc_norm": 0.3992950895868925, "acc_norm_stderr": 0.03515648307530415, "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237009, "mc2": 0.4200995329344425, "mc2_stderr": 0.01434315654117436 }, "harness|arc:challenge|25": { "acc": 0.35921501706484643, "acc_stderr": 0.014020224155839159, "acc_norm": 0.3993174061433447, "acc_norm_stderr": 0.014312094557946698 }, "harness|hellaswag|10": { "acc": 0.5060744871539534, "acc_stderr": 0.004989413158034801, "acc_norm": 0.6811392152957578, "acc_norm_stderr": 0.004650825168905203 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464242, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464242 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.46710526315789475, "acc_stderr": 0.04060127035236397, "acc_norm": 0.46710526315789475, "acc_norm_stderr": 0.04060127035236397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.41509433962264153, "acc_stderr": 0.03032594578928611, "acc_norm": 0.41509433962264153, "acc_norm_stderr": 0.03032594578928611 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3680555555555556, "acc_stderr": 0.040329990539607195, "acc_norm": 0.3680555555555556, "acc_norm_stderr": 0.040329990539607195 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117317, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117317 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, 
"acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3699421965317919, "acc_stderr": 0.036812296333943194, "acc_norm": 0.3699421965317919, "acc_norm_stderr": 0.036812296333943194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.33191489361702126, "acc_stderr": 0.030783736757745643, "acc_norm": 0.33191489361702126, "acc_norm_stderr": 0.030783736757745643 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322004, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322004 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3793103448275862, "acc_stderr": 0.040434618619167466, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.040434618619167466 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2830687830687831, "acc_stderr": 0.023201392938194974, "acc_norm": 0.2830687830687831, "acc_norm_stderr": 0.023201392938194974 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3709677419354839, "acc_stderr": 0.027480541887953593, "acc_norm": 0.3709677419354839, "acc_norm_stderr": 0.027480541887953593 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3251231527093596, "acc_stderr": 0.032957975663112704, "acc_norm": 0.3251231527093596, "acc_norm_stderr": 0.032957975663112704 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.4727272727272727, "acc_stderr": 0.0389853160557942, "acc_norm": 0.4727272727272727, "acc_norm_stderr": 0.0389853160557942 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.4444444444444444, "acc_stderr": 0.035402943770953675, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.035402943770953675 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5233160621761658, "acc_stderr": 0.03604513672442202, "acc_norm": 0.5233160621761658, "acc_norm_stderr": 0.03604513672442202 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.31025641025641026, "acc_stderr": 0.02345467488940429, "acc_norm": 0.31025641025641026, "acc_norm_stderr": 0.02345467488940429 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.02620276653465215, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.02620276653465215 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3067226890756303, "acc_stderr": 0.02995382389188705, "acc_norm": 0.3067226890756303, "acc_norm_stderr": 0.02995382389188705 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.44220183486238535, "acc_stderr": 0.02129361320752021, "acc_norm": 0.44220183486238535, "acc_norm_stderr": 0.02129361320752021 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3287037037037037, "acc_stderr": 
0.03203614084670058, "acc_norm": 0.3287037037037037, "acc_norm_stderr": 0.03203614084670058 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.45588235294117646, "acc_stderr": 0.03495624522015474, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.03495624522015474 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.510548523206751, "acc_stderr": 0.032539983791662855, "acc_norm": 0.510548523206751, "acc_norm_stderr": 0.032539983791662855 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4618834080717489, "acc_stderr": 0.03346015011973228, "acc_norm": 0.4618834080717489, "acc_norm_stderr": 0.03346015011973228 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.44274809160305345, "acc_stderr": 0.04356447202665069, "acc_norm": 0.44274809160305345, "acc_norm_stderr": 0.04356447202665069 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5537190082644629, "acc_stderr": 0.0453793517794788, "acc_norm": 0.5537190082644629, "acc_norm_stderr": 0.0453793517794788 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.42592592592592593, "acc_stderr": 0.0478034362693679, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.0478034362693679 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.38650306748466257, "acc_stderr": 0.038258255488486076, "acc_norm": 0.38650306748466257, "acc_norm_stderr": 0.038258255488486076 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2767857142857143, "acc_stderr": 0.042466243366976256, "acc_norm": 0.2767857142857143, "acc_norm_stderr": 0.042466243366976256 }, "harness|hendrycksTest-management|5": { "acc": 0.34951456310679613, "acc_stderr": 0.047211885060971716, "acc_norm": 0.34951456310679613, "acc_norm_stderr": 0.047211885060971716 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5085470085470085, "acc_stderr": 0.0327513030009703, "acc_norm": 0.5085470085470085, "acc_norm_stderr": 0.0327513030009703 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 
0.05016135580465918, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465918 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.508301404853129, "acc_stderr": 0.017877498991072, "acc_norm": 0.508301404853129, "acc_norm_stderr": 0.017877498991072 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.407514450867052, "acc_stderr": 0.026454578146931494, "acc_norm": 0.407514450867052, "acc_norm_stderr": 0.026454578146931494 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2927374301675978, "acc_stderr": 0.015218109544410179, "acc_norm": 0.2927374301675978, "acc_norm_stderr": 0.015218109544410179 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.41830065359477125, "acc_stderr": 0.028245134024387285, "acc_norm": 0.41830065359477125, "acc_norm_stderr": 0.028245134024387285 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.39228295819935693, "acc_stderr": 0.027731258647011998, "acc_norm": 0.39228295819935693, "acc_norm_stderr": 0.027731258647011998 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.41358024691358025, "acc_stderr": 0.02740204204026994, "acc_norm": 0.41358024691358025, "acc_norm_stderr": 0.02740204204026994 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3262411347517731, "acc_stderr": 0.027968453043563168, "acc_norm": 0.3262411347517731, "acc_norm_stderr": 0.027968453043563168 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3305084745762712, "acc_stderr": 0.01201414210184297, "acc_norm": 0.3305084745762712, "acc_norm_stderr": 0.01201414210184297 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.40808823529411764, "acc_stderr": 0.029855261393483924, "acc_norm": 0.40808823529411764, "acc_norm_stderr": 0.029855261393483924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3709150326797386, "acc_stderr": 0.019542101564854114, "acc_norm": 0.3709150326797386, "acc_norm_stderr": 0.019542101564854114 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.43636363636363634, "acc_stderr": 
0.04750185058907297, "acc_norm": 0.43636363636363634, "acc_norm_stderr": 0.04750185058907297 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5020408163265306, "acc_stderr": 0.0320089533497105, "acc_norm": 0.5020408163265306, "acc_norm_stderr": 0.0320089533497105 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4975124378109453, "acc_stderr": 0.03535490150137288, "acc_norm": 0.4975124378109453, "acc_norm_stderr": 0.03535490150137288 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.57, "acc_stderr": 0.04975698519562427, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562427 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.038284011150790206, "acc_norm": 0.40963855421686746, "acc_norm_stderr": 0.038284011150790206 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5029239766081871, "acc_stderr": 0.03834759370936839, "acc_norm": 0.5029239766081871, "acc_norm_stderr": 0.03834759370936839 }, "harness|truthfulqa:mc|0": { "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237009, "mc2": 0.4200995329344425, "mc2_stderr": 0.01434315654117436 }, "harness|winogrande|5": { "acc": 0.648776637726914, "acc_stderr": 0.013415981370545135 }, "harness|gsm8k|5": { "acc": 0.043214556482183475, "acc_stderr": 0.005600987515237865 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
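The per-task scores in a payload like the one shown under "Latest results" can be post-processed with plain Python, with no Hub access needed. A minimal sketch, using a small hypothetical excerpt of the dict rather than the full payload:

```python
# Average the "acc" of the MMLU (hendrycksTest) entries in a results dict
# shaped like the "Latest results" JSON. `results` here is a tiny
# hypothetical excerpt, not the actual evaluation payload.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.46710526315789475},
    "harness|arc:challenge|25": {"acc": 0.35921501706484643},
}

# Keep only the MMLU tasks, identified by their config-name prefix.
mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The same key-prefix filter works for any task family in the payload (e.g. `harness|truthfulqa` or `harness|winogrande`).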
gaizerick/emi
---
license: openrail
---
ivrit-ai/audio-vad
---
language:
- he
license: other
size_categories:
- 1M<n<10M
task_categories:
- audio-classification
- voice-activity-detection
extra_gated_prompt: 'You agree to the following license terms: This material and data
  is licensed under the terms of the Creative Commons Attribution 4.0 International
  License (CC BY 4.0), The full text of the CC-BY 4.0 license is available at
  https://creativecommons.org/licenses/by/4.0/. Notwithstanding the foregoing, this
  material and data may only be used, modified and distributed for the express purpose
  of training AI models, and subject to the foregoing restriction. In addition, this
  material and data may not be used in order to create audiovisual material that
  simulates the voice or likeness of the specific individuals appearing or speaking
  in such materials and data (a “deep-fake”). To the extent this paragraph is
  inconsistent with the CC-BY-4.0 license, the terms of this paragraph shall govern.
  By downloading or using any of this material or data, you agree that the Project
  makes no representations or warranties in respect of the data, and shall have no
  liability in respect thereof. These disclaimers and limitations are in addition to
  any disclaimers and limitations set forth in the CC-BY-4.0 license itself. You
  understand that the project is only able to make available the materials and data
  pursuant to these disclaimers and limitations, and without such disclaimers and
  limitations the project would not be able to make available the materials and data
  for your use.'
extra_gated_fields:
  I have read the license, and agree to its terms: checkbox
dataset_info:
  features:
  - name: audio
    dtype: audio
  - name: episode
    dtype: string
  - name: source
    dtype: string
  - name: uuid
    dtype: string
  - name: attrs
    struct:
    - name: duration
      dtype: float64
    - name: end
      dtype: float64
    - name: license
      dtype: string
    - name: segment
      dtype: int64
    - name: start
      dtype: float64
  splits:
  - name: train
    num_bytes: 704608554540.66
    num_examples: 5657270
  download_size: 473125104970
  dataset_size: 704608554540.66
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

ivrit.ai is a database of Hebrew audio and text content.

**audio-base** contains the raw, unprocessed sources.

**audio-vad** contains audio snippets generated by applying Silero VAD (https://github.com/snakers4/silero-vad) to the base dataset.

**audio-transcripts** contains transcriptions for each snippet in the audio-vad dataset.

The audio-base dataset contains data from the following sources:

* Geekonomy (Podcast, https://geekonomy.net)
* HaCongress (Podcast, https://hacongress.podbean.com/)
* Idan Eretz's YouTube channel (https://www.youtube.com/@IdanEretz)
* Moneytime (Podcast, https://money-time.co.il)
* Mor'e Nevohim (Podcast, https://open.spotify.com/show/1TZeexEk7n60LT1SlS2FE2?si=937266e631064a3c)
* Yozevitch's World (Podcast, https://www.yozevitch.com/yozevitch-podcast)
* NETfrix (Podcast, https://netfrix.podbean.com)
* On Meaning (Podcast, https://mashmaut.buzzsprout.com)
* Shnekel (Podcast, https://www.shnekel.live)
* Bite-sized History (Podcast, https://soundcloud.com/historia-il)
* Tziun 3 (Podcast, https://tziun3.co.il)
* Academia Israel (https://www.youtube.com/@academiaisrael6115)
* Shiluv Maagal (https://www.youtube.com/@ShiluvMaagal)

Paper: https://arxiv.org/abs/2307.08720

If you use our datasets, please use the following citation:

```
@misc{marmor2023ivritai,
      title={ivrit.ai: A Comprehensive Dataset of Hebrew Speech for AI Research and Development},
      author={Yanir Marmor and Kinneret Misgav and Yair Lifshitz},
      year={2023},
      eprint={2307.08720},
      archivePrefix={arXiv},
      primaryClass={eess.AS}
}
```
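Given the `attrs` struct declared in the schema above (per-segment `start`, `end`, and `duration` in seconds), aggregate statistics can be computed without touching the audio itself. A minimal sketch over hypothetical rows, not the real data:

```python
# Sum per-segment durations from records shaped like the `attrs` struct
# in the dataset_info schema. These two rows are made-up examples, not
# actual entries from the audio-vad dataset.
rows = [
    {"uuid": "a1", "attrs": {"segment": 0, "start": 0.0, "end": 4.2, "duration": 4.2}},
    {"uuid": "a2", "attrs": {"segment": 1, "start": 4.2, "end": 9.7, "duration": 5.5}},
]

total_seconds = sum(r["attrs"]["duration"] for r in rows)
print(f"total speech: {total_seconds:.1f}s over {len(rows)} segments")
```

The same pattern applies when iterating the real dataset in streaming mode, since each example carries its `attrs` metadata alongside the audio.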
open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B
--- pretty_name: Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Eurdem/Voltran-1.0-MoE-2x7B](https://huggingface.co/Eurdem/Voltran-1.0-MoE-2x7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-22T07:49:54.062079](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B/blob/main/results_2024-01-22T07-49-54.062079.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144636049773026,\n\ \ \"acc_stderr\": 0.033017267421085336,\n \"acc_norm\": 0.6168693020696908,\n\ \ \"acc_norm_stderr\": 0.033678924264574035,\n \"mc1\": 0.408812729498164,\n\ \ \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5748009213372511,\n\ \ \"mc2_stderr\": 0.015610411040968409\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5955631399317406,\n \"acc_stderr\": 0.014342036483436179,\n\ \ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839162\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6444931288587931,\n\ \ \"acc_stderr\": 0.004776883632722614,\n \"acc_norm\": 0.837382991435969,\n\ \ \"acc_norm_stderr\": 0.0036826171219143085\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \ \ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\ acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\ \ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\ \ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\ : 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\ acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644823,\n \ \ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644823\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\ \ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\ \ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n 
\"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\ acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\ \ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\ \ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\ \ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\ \ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\ acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n 
\"acc\": 0.42063492063492064,\n\ \ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\ \ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\ \ \"acc_stderr\": 0.02822949732031721,\n \"acc_norm\": 0.5612903225806452,\n\ \ \"acc_norm_stderr\": 0.02822949732031721\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\ acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\ \ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940788,\n\ \ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940788\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32592592592592595,\n \"acc_stderr\": 
0.028578348365473072,\n \ \ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\ \ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739152,\n \"\ acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739152\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\ acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"\ acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \ \ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\ \ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\ \ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\ \ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\ acc_norm\": 
0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\ \ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\ \ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\ \ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\ \ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\ \ \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n\ \ \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242836,\n\ \ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242836\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\ \ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\ \ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 
0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\ \ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\ \ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\ \ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\ \ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \ \ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\ \ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\ \ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n\ \ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \ \ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\ \ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 
0.5323383084577115,\n\ \ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\ \ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\ \ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\ \ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n\ \ \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5748009213372511,\n\ \ \"mc2_stderr\": 0.015610411040968409\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5595147839272175,\n \ \ \"acc_stderr\": 0.013674572131693888\n }\n}\n```" repo_url: https://huggingface.co/Eurdem/Voltran-1.0-MoE-2x7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|arc:challenge|25_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-22T07-49-54.062079.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|gsm8k|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hellaswag|10_2024-01-22T07-49-54.062079.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T07-49-54.062079.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|truthfulqa:mc|0_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-22T07-49-54.062079.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_22T07_49_54.062079 path: - '**/details_harness|winogrande|5_2024-01-22T07-49-54.062079.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-22T07-49-54.062079.parquet' - config_name: results data_files: - split: 
2024_01_22T07_49_54.062079 path: - results_2024-01-22T07-49-54.062079.parquet - split: latest path: - results_2024-01-22T07-49-54.062079.parquet --- # Dataset Card for Evaluation run of Eurdem/Voltran-1.0-MoE-2x7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Eurdem/Voltran-1.0-MoE-2x7B](https://huggingface.co/Eurdem/Voltran-1.0-MoE-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-22T07:49:54.062079](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__Voltran-1.0-MoE-2x7B/blob/main/results_2024-01-22T07-49-54.062079.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6144636049773026, "acc_stderr": 0.033017267421085336, "acc_norm": 0.6168693020696908, "acc_norm_stderr": 0.033678924264574035, "mc1": 0.408812729498164, "mc1_stderr": 0.01720995215164173, "mc2": 0.5748009213372511, "mc2_stderr": 0.015610411040968409 }, "harness|arc:challenge|25": { "acc": 0.5955631399317406, "acc_stderr": 0.014342036483436179, "acc_norm": 0.6407849829351536, "acc_norm_stderr": 0.014020224155839162 }, "harness|hellaswag|10": { "acc": 0.6444931288587931, "acc_stderr": 0.004776883632722614, "acc_norm": 0.837382991435969, "acc_norm_stderr": 0.0036826171219143085 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.038234289699266046, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.038234289699266046 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6641509433962264, "acc_stderr": 0.029067220146644823, "acc_norm": 0.6641509433962264, "acc_norm_stderr": 0.029067220146644823 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, 
"acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.03724249595817731, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.03724249595817731 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.047240073523838876, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.047240073523838876 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.03267151848924777, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.03267151848924777 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5612903225806452, "acc_stderr": 0.02822949732031721, "acc_norm": 0.5612903225806452, "acc_norm_stderr": 0.02822949732031721 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4827586206896552, "acc_stderr": 0.035158955511656986, 
"acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.02649905770139744, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.02649905770139744 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5897435897435898, "acc_stderr": 0.024939313906940788, "acc_norm": 0.5897435897435898, "acc_norm_stderr": 0.024939313906940788 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473072, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473072 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6218487394957983, "acc_stderr": 0.031499305777849054, "acc_norm": 0.6218487394957983, "acc_norm_stderr": 0.031499305777849054 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8201834862385321, "acc_stderr": 0.01646534546739152, "acc_norm": 0.8201834862385321, "acc_norm_stderr": 0.01646534546739152 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.034093869469927006 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.029554292605695063, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.029554292605695063 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6502242152466368, "acc_stderr": 0.03200736719484503, "acc_norm": 0.6502242152466368, "acc_norm_stderr": 0.03200736719484503 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097653, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097653 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.035590395316173425, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8109833971902938, "acc_stderr": 0.014000791294406999, "acc_norm": 0.8109833971902938, "acc_norm_stderr": 0.014000791294406999 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6734104046242775, "acc_stderr": 0.025248264774242836, "acc_norm": 0.6734104046242775, "acc_norm_stderr": 0.025248264774242836 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4324022346368715, "acc_stderr": 0.01656897123354861, "acc_norm": 0.4324022346368715, "acc_norm_stderr": 0.01656897123354861 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6928104575163399, "acc_stderr": 0.026415601914388992, "acc_norm": 0.6928104575163399, "acc_norm_stderr": 0.026415601914388992 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464492, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464492 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.02973659252642444, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4452411994784876, "acc_stderr": 0.012693421303973294, "acc_norm": 0.4452411994784876, "acc_norm_stderr": 0.012693421303973294 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6507352941176471, "acc_stderr": 0.028959755196824866, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.028959755196824866 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6633986928104575, "acc_stderr": 0.019117213911495144, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.019117213911495144 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 
0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5323383084577115, "acc_stderr": 0.03528131472933607, "acc_norm": 0.5323383084577115, "acc_norm_stderr": 0.03528131472933607 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.408812729498164, "mc1_stderr": 0.01720995215164173, "mc2": 0.5748009213372511, "mc2_stderr": 0.015610411040968409 }, "harness|winogrande|5": { "acc": 0.7655880031570639, "acc_stderr": 0.011906130106237986 }, "harness|gsm8k|5": { "acc": 0.5595147839272175, "acc_stderr": 0.013674572131693888 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
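The per-task scores in the "Latest results" block of this card are plain JSON, so they can be post-processed directly once downloaded. As an illustrative sketch (the three entries below are copied from the results block above; any larger analysis would load the full `results_*.json` file instead), tasks can be ranked by accuracy:

```python
import json

# A few entries copied verbatim from the "Latest results" block above.
results_json = """
{
  "harness|hendrycksTest-marketing|5": {"acc": 0.8888888888888888},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8187134502923976},
  "harness|hendrycksTest-college_mathematics|5": {"acc": 0.34}
}
"""

scores = json.loads(results_json)

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{metrics['acc']:.3f}  {task}")
```

The same pattern applies to the `acc_norm`, `mc1`, and `mc2` metrics present in the full results file.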
deu05232/multiwoz_v23
--- dataset_info: features: - name: intent sequence: string - name: text dtype: string splits: - name: train num_bytes: 5836889 num_examples: 54176 - name: validation num_bytes: 777785 num_examples: 7084 - name: test num_bytes: 772136 num_examples: 7056 download_size: 0 dataset_size: 7386810 --- # Dataset Card for "multiwoz_v23" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
theojiang/image-text-dataset-subset-300k-captions_text
--- dataset_info: features: - name: image dtype: image - name: caption dtype: string splits: - name: train num_bytes: 49287648196.36 num_examples: 380536 download_size: 49537329244 dataset_size: 49287648196.36 configs: - config_name: default data_files: - split: train path: data/train-* ---
minoassad/SDhistory
--- license: afl-3.0 ---
open-llm-leaderboard/details_Test157t__Copium-Cola-9B
--- pretty_name: Evaluation run of Test157t/Copium-Cola-9B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Test157t/Copium-Cola-9B](https://huggingface.co/Test157t/Copium-Cola-9B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Copium-Cola-9B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-05T01:59:32.119072](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Copium-Cola-9B/blob/main/results_2024-03-05T01-59-32.119072.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6526140215006031,\n\ \ \"acc_stderr\": 0.03217279241801859,\n \"acc_norm\": 0.6532485904493103,\n\ \ \"acc_norm_stderr\": 0.0328355773349137,\n \"mc1\": 0.5226438188494492,\n\ \ \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6859620372982971,\n\ \ \"mc2_stderr\": 0.015124181113306925\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\ \ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537377\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7006572395937064,\n\ \ \"acc_stderr\": 0.004570342034463277,\n \"acc_norm\": 0.874228241386178,\n\ \ \"acc_norm_stderr\": 0.0033091427273510897\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\ \ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\ \ \"acc_norm_stderr\": 0.03551446610810826\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\ \ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\ \ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\ \ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\ \ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\ \ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4365079365079365,\n \"acc_stderr\": 0.02554284681740049,\n \"\ acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.02554284681740049\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\ \ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\ \ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\ \ \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n\ \ \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\ \ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\ \ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218964,\n \"\ acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218964\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\ \ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \ \ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \ \ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\ acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\ acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\ acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\ acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\ \ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\ \ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\ \ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n \"acc_stderr\": 
0.03640118271990946,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\ \ \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n\ \ \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\ \ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n\ \ \"acc_stderr\": 0.016653875777524,\n \"acc_norm\": 0.4547486033519553,\n\ \ \"acc_norm_stderr\": 0.016653875777524\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\ \ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\ \ \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n\ \ \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\ \ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \ \ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\ \ 
\"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\ \ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n\ \ \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6859620372982971,\n\ \ \"mc2_stderr\": 0.015124181113306925\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.01030920949818748\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6322971948445792,\n \ \ \"acc_stderr\": 0.01328163050339548\n }\n}\n```" repo_url: https://huggingface.co/Test157t/Copium-Cola-9B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|arc:challenge|25_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-05T01-59-32.119072.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|gsm8k|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hellaswag|10_2024-03-05T01-59-32.119072.parquet' - split: latest 
path: - '**/details_harness|hellaswag|10_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-59-32.119072.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-59-32.119072.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-59-32.119072.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-59-32.119072.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-59-32.119072.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-59-32.119072.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-59-32.119072.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-59-32.119072.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|truthfulqa:mc|0_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-05T01-59-32.119072.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_05T01_59_32.119072 path: - '**/details_harness|winogrande|5_2024-03-05T01-59-32.119072.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-05T01-59-32.119072.parquet' - config_name: results data_files: - split: 
2024_03_05T01_59_32.119072 path: - results_2024-03-05T01-59-32.119072.parquet - split: latest path: - results_2024-03-05T01-59-32.119072.parquet
---

# Dataset Card for Evaluation run of Test157t/Copium-Cola-9B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Test157t/Copium-Cola-9B](https://huggingface.co/Test157t/Copium-Cola-9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Test157t__Copium-Cola-9B",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2024-03-05T01:59:32.119072](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Copium-Cola-9B/blob/main/results_2024-03-05T01-59-32.119072.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6526140215006031, "acc_stderr": 0.03217279241801859, "acc_norm": 0.6532485904493103, "acc_norm_stderr": 0.0328355773349137, "mc1": 0.5226438188494492, "mc1_stderr": 0.01748554225848964, "mc2": 0.6859620372982971, "mc2_stderr": 0.015124181113306925 }, "harness|arc:challenge|25": { "acc": 0.6834470989761092, "acc_stderr": 0.013592431519068079, "acc_norm": 0.7141638225255973, "acc_norm_stderr": 0.013203196088537377 }, "harness|hellaswag|10": { "acc": 0.7006572395937064, "acc_stderr": 0.004570342034463277, "acc_norm": 0.874228241386178, "acc_norm_stderr": 0.0033091427273510897 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.04960449637488583, "acc_norm": 0.58, "acc_norm_stderr": 0.04960449637488583 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.032081157507886836, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.032081157507886836 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4365079365079365, "acc_stderr": 0.02554284681740049, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.02554284681740049 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.022891687984554963, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.022891687984554963 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.4827586206896552, "acc_stderr": 0.035158955511657, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511657 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218964, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218964 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.02813325257881563, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.02813325257881563 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.40397350993377484, "acc_stderr": 0.040064856853653415, "acc_norm": 0.40397350993377484, "acc_norm_stderr": 0.040064856853653415 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507337, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507337 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 
0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596914, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596914 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8122605363984674, "acc_stderr": 0.013964393769899126, "acc_norm": 0.8122605363984674, "acc_norm_stderr": 0.013964393769899126 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258172, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258172 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4547486033519553, "acc_stderr": 0.016653875777524, "acc_norm": 0.4547486033519553, "acc_norm_stderr": 0.016653875777524 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137904, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137904 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188933, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188933 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653349, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653349 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 
0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482706, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699121, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5226438188494492, "mc1_stderr": 0.01748554225848964, "mc2": 0.6859620372982971, "mc2_stderr": 0.015124181113306925 }, "harness|winogrande|5": { "acc": 0.8397790055248618, "acc_stderr": 0.01030920949818748 }, "harness|gsm8k|5": { "acc": 0.6322971948445792, "acc_stderr": 0.01328163050339548 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
Tamazight-NLP/translatewiki
--- configs: - config_name: en-zgh data_files: en-zgh.csv default: true - config_name: en-kab data_files: en-kab.csv - config_name: en-tzm data_files: en-tzm.csv - config_name: en-shi data_files: en-shi.csv - config_name: en-rif data_files: en-rif.csv - config_name: en-shy data_files: en-shy.csv task_categories: - translation - text2text-generation language: - ber - zgh - kab - tzm - shi - rif - shy - en license: cc-by-3.0 pretty_name: Translatewiki ---
saibo/bookcorpus_compact_1024_test
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: concept_with_offset
    dtype: string
  splits:
  - name: train
    num_bytes: 75334225
    num_examples: 6160
  download_size: 38920916
  dataset_size: 75334225
---

# Dataset Card for "bookcorpus_compact_1024_test"

6160 samples randomly sampled from shard 9 of Bookcorpus_compact_1024:

```python
from datasets import load_dataset, Dataset

corpus_name = "xxx"
ds = load_dataset(corpus_name, split="train")
shuffled_ds = ds.shuffle(seed=42)
test_ds = Dataset.from_dict(shuffled_ds[:6160])  # 6160 == len(ds) // 10
```

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
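The selection above boils down to a deterministic seeded shuffle followed by taking the first tenth of the shard. The same logic can be sketched with the standard library alone; note that the shard size of 61,600 used here is an assumption inferred from 6160 = len(ds)//10, not a figure taken from the shard itself:

```python
import random

# Stand-in corpus; 61,600 is an assumed shard size (6160 * 10), for illustration only.
corpus = [f"doc-{i}" for i in range(61_600)]

rng = random.Random(42)   # fixed seed, mirroring ds.shuffle(seed=42)
shuffled = corpus.copy()
rng.shuffle(shuffled)     # in-place, deterministic given the seed

test_split = shuffled[:len(corpus) // 10]  # first tenth -> 6160 samples
print(len(test_split))  # 6160
```

Because the shuffle is seeded, rerunning the snippet always selects the same 6160 documents, which is what makes the published test split reproducible.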
BuroIdentidadDigital/cedula_Profecional
--- license: c-uda ---
Adriatogi/graffiti
---
configs:
- config_name: default
  data_files:
  - split: data
    path: data/data-*
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype: image
  splits:
  - name: data
    num_bytes: 20474851.0
    num_examples: 120
  download_size: 0
  dataset_size: 20474851.0
---

# Dataset Card for "graffiti"

Although the label images appear all black in the preview, they do contain the respective bounding-box labels for the graffiti, with labeled pixels set to 1.

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Hjallti/solar-strike
--- license: mit task_categories: - text-generation - text-classification - token-classification - question-answering language: - en tags: - sol pretty_name: solar-strike size_categories: - 1K<n<10K --- # Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. 
## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
dplutchok/llama2-train10
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 5232.204722895273 num_examples: 10 download_size: 10634 dataset_size: 5232.204722895273 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "llama2-train10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
getawayfrommeXD/embedded_ner_tokens
--- dataset_info: features: - name: word dtype: string - name: label dtype: string - name: OOV dtype: bool - name: embedding sequence: float32 splits: - name: train num_bytes: 248048533 num_examples: 203621 - name: validation num_bytes: 62568404 num_examples: 51362 - name: test num_bytes: 56564938 num_examples: 46435 download_size: 130105515 dataset_size: 367181875 --- # Dataset Card for "embedded_ner_tokens" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
winograd_wsc
--- annotations_creators: - expert-generated language_creators: - expert-generated language: - en license: - cc-by-4.0 multilinguality: - monolingual size_categories: - n<1K source_datasets: - original task_categories: - multiple-choice task_ids: - multiple-choice-coreference-resolution paperswithcode_id: wsc pretty_name: Winograd Schema Challenge dataset_info: - config_name: wsc285 features: - name: text dtype: string - name: pronoun dtype: string - name: pronoun_loc dtype: int32 - name: quote dtype: string - name: quote_loc dtype: int32 - name: options sequence: string - name: label dtype: class_label: names: '0': '0' '1': '1' - name: source dtype: string splits: - name: test num_bytes: 52281 num_examples: 285 download_size: 113235 dataset_size: 52281 - config_name: wsc273 features: - name: text dtype: string - name: pronoun dtype: string - name: pronoun_loc dtype: int32 - name: quote dtype: string - name: quote_loc dtype: int32 - name: options sequence: string - name: label dtype: class_label: names: '0': '0' '1': '1' - name: source dtype: string splits: - name: test num_bytes: 49674 num_examples: 273 download_size: 113235 dataset_size: 49674 --- # Dataset Card for The Winograd Schema Challenge ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known 
Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)
  - [Contributions](#contributions)

## Dataset Description

- **Homepage:** https://cs.nyu.edu/faculty/davise/papers/WinogradSchemas/WS.html
- **Repository:**
- **Paper:** https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.729.9814&rep=rep1&type=pdf
- **Leaderboard:**
- **Point of Contact:**

### Dataset Summary

A Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is resolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its resolution. The schema takes its name from a well-known example by Terry Winograd:

> The city councilmen refused the demonstrators a permit because they [feared/advocated] violence.

If the word is "feared", then "they" presumably refers to the city council; if it is "advocated", then "they" presumably refers to the demonstrators.

### Supported Tasks and Leaderboards

From the official webpage:

> A contest, entitled the Winograd Schema Challenge was run once, in 2016. At that time, there was a cash prize offered for achieving human-level performance in the contest. Since then, the sponsor has withdrawn; therefore NO CASH PRIZES CAN BE OFFERED OR WILL BE AWARDED FOR ANY KIND OF PERFORMANCE OR ACHIEVEMENT ON THIS CHALLENGE.

### Languages

The dataset is in English.

[Translation of 12 WSs into Chinese](https://cs.nyu.edu/faculty/davise/papers/WinogradSchemas/WSChinese.html) (translated by Wei Xu).
Translations into Japanese, by Soichiro Tanaka, Rafal Rzepka, and Shiho Katajima\
**Translation changing English names to Japanese** [PDF](https://cs.nyu.edu/faculty/davise/papers/WinogradSchemas/collection_ja.pdf)    [HTML](http://arakilab.media.eng.hokudai.ac.jp/~kabura/collection_ja.html)\
**Translation preserving English names** [PDF](https://cs.nyu.edu/faculty/davise/papers/WinogradSchemas/collection_katakana.pdf)    [HTML](http://arakilab.media.eng.hokudai.ac.jp/~kabura/collection_katakana.html)

[Translation into French](http://www.llf.cnrs.fr/winograd-fr), by Pascal Amsili and Olga Seminck

[Winograd Schemas in Portuguese](https://sol.sbc.org.br/index.php/eniac/article/view/9334) by Gabriela Melo, Vinicius Imaizumi, and Fábio Cozman.

[Mandarinograd: A Chinese Collection of Winograd Schemas](https://www.aclweb.org/anthology/2020.lrec-1.3) by Timothée Bernard and Ting Han, LREC-2020.

## Dataset Structure

### Data Instances

Each instance contains a text passage with a designated pronoun and two possible answers indicating which entity in the passage the pronoun represents. An example instance looks like the following:

```python
{
  'label': 0,
  'options': ['The city councilmen', 'The demonstrators'],
  'pronoun': 'they',
  'pronoun_loc': 63,
  'quote': 'they feared violence',
  'quote_loc': 63,
  'source': '(Winograd 1972)',
  'text': 'The city councilmen refused the demonstrators a permit because they feared violence.'
}
```

### Data Fields

- `text` (str): The text sequence
- `options` (list[str]): The two entity options that the pronoun may be referring to
- `label` (int): The index of the correct option in the `options` field
- `pronoun` (str): The pronoun in the sequence to be resolved
- `pronoun_loc` (int): The starting position of the pronoun in the sequence
- `quote` (str): The substring with the key action or context surrounding the pronoun
- `quote_loc` (int): The starting position of the quote in the sequence
- `source` (str): A description of the source who contributed the example

### Data Splits

Only a test split is included.

## Dataset Creation

### Curation Rationale

The Winograd Schema Challenge was proposed as an automated evaluation of an AI system's commonsense linguistic understanding. From the webpage:

> The strengths of the challenge are that it is clear-cut, in that the answer to each schema is a binary choice; vivid, in that it is obvious to non-experts that a program that fails to get the right answers clearly has serious gaps in its understanding; and difficult, in that it is far beyond the current state of the art.

### Source Data

#### Initial Data Collection and Normalization

This data was manually written by experts such that the schemas are:

- easily disambiguated by the human reader (ideally, so easily that the reader does not even notice that there is an ambiguity);
- not solvable by simple techniques such as selectional restrictions;
- Google-proof; that is, there is no obvious statistical test over text corpora that will reliably disambiguate these correctly.

#### Who are the source language producers?

This dataset has grown over time, and so was produced by a variety of linguistic and AI researchers. See the `source` field for the source of each instance.

### Annotations

#### Annotation process

Annotations are produced by the experts who construct the examples.

#### Who are the annotators?

See above.
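The character offsets described under Data Fields (`pronoun`, `pronoun_loc`, together with `options` and `label`) are enough to splice the gold antecedent back into the passage, which is how substitution-based evaluations of the challenge typically begin. A minimal sketch using the instance from the Data Instances section; the `resolve` helper is illustrative, not part of the dataset:

```python
# The example instance shown under Data Instances (non-essential fields omitted).
example = {
    'label': 0,
    'options': ['The city councilmen', 'The demonstrators'],
    'pronoun': 'they',
    'pronoun_loc': 63,
    'text': 'The city councilmen refused the demonstrators a permit '
            'because they feared violence.',
}

def resolve(ex: dict) -> str:
    """Replace the ambiguous pronoun with the gold antecedent."""
    start = ex['pronoun_loc']
    end = start + len(ex['pronoun'])
    assert ex['text'][start:end] == ex['pronoun']  # offsets are character-based
    return ex['text'][:start] + ex['options'][ex['label']] + ex['text'][end:]

print(resolve(example))
# The city councilmen refused the demonstrators a permit because The city councilmen feared violence.
```

Substituting the other option (`ex['options'][1 - ex['label']]`) in the same way yields the incorrect reading, so a system can be scored by asking which substituted sentence it finds more plausible.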
### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators This dataset has grown over time, and so was produced by a variety of linguistic and AI researchers. See the `source` field for the source of each instance. ### Licensing Information This work is licensed under a [Creative Commons Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/). ### Citation Information The Winograd Schema Challenge, including many of the examples here, was proposed by [Levesque et al 2012](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.729.9814&rep=rep1&type=pdf): ``` @inproceedings{levesque2012winograd, title={The winograd schema challenge}, author={Levesque, Hector and Davis, Ernest and Morgenstern, Leora}, booktitle={Thirteenth International Conference on the Principles of Knowledge Representation and Reasoning}, year={2012}, organization={Citeseer} } ``` ### Contributions Thanks to [@joeddav](https://github.com/joeddav) for adding this dataset.
tyzhu/squad_qa_wrong_num_v5_full_recite_full_passage_first_permute_rerun
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: id dtype: string - name: title dtype: string - name: context dtype: string - name: question dtype: string - name: answers sequence: - name: text dtype: string - name: answer_start dtype: int32 - name: answer dtype: string - name: context_id dtype: string - name: correct_id dtype: string - name: inputs dtype: string - name: targets dtype: string splits: - name: train num_bytes: 8767142.788690874 num_examples: 4778 - name: validation num_bytes: 584108 num_examples: 300 download_size: 1757345 dataset_size: 9351250.788690874 --- # Dataset Card for "squad_qa_wrong_num_v5_full_recite_full_passage_first_permute_rerun" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tonyassi/celebrity-1000-embeddings
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': Aaron Eckhart '1': Aaron Paul '2': Aaron Rodgers '3': Aaron Taylor-Johnson '4': Abbi Jacobson '5': Abhishek Bachchan '6': Abigail Breslin '7': Abigail Spencer '8': Adam Brody '9': Adam Devine '10': Adam Driver '11': Adam Lambert '12': Adam Levine '13': Adam Sandler '14': Adam Scott '15': Adele '16': Adrian Grenier '17': Adèle Exarchopoulos '18': Aidan Gillen '19': Aidan Turner '20': Aishwarya Rai '21': Aja Naomi King '22': Alden Ehrenreich '23': Aldis Hodge '24': Alec Baldwin '25': Alex Morgan '26': Alex Pettyfer '27': Alex Rodriguez '28': Alexander Skarsgård '29': Alexandra Daddario '30': Alfre Woodard '31': Alia Shawkat '32': Alice Braga '33': Alice Eve '34': Alicia Keys '35': Alicia Vikander '36': Alison Brie '37': Allison Janney '38': Allison Williams '39': Alyson Hannigan '40': Amanda Peet '41': Amanda Seyfried '42': Amandla Stenberg '43': Amber Heard '44': America Ferrera '45': Amy Adams '46': Amy Poehler '47': Amy Schumer '48': Ana de Armas '49': Andie MacDowell '50': Andrew Garfield '51': Andrew Lincoln '52': Andrew Scott '53': Andy Garcia '54': Andy Samberg '55': Andy Serkis '56': Angela Bassett '57': Angelina Jolie '58': Anna Camp '59': Anna Faris '60': Anna Kendrick '61': Anna Paquin '62': AnnaSophia Robb '63': Annabelle Wallis '64': Anne Hathaway '65': Anne Marie '66': Anne-Marie '67': Ansel Elgort '68': Anson Mount '69': Anthony Hopkins '70': Anthony Joshua '71': Anthony Mackie '72': Antonio Banderas '73': Anya Taylor-Joy '74': Ariana Grande '75': Armie Hammer '76': Ashley Judd '77': Ashton Kutcher '78': Aubrey Plaza '79': Auli'i Cravalho '80': Awkwafina '81': Barack Obama '82': Bella Hadid '83': Bella Thorne '84': Ben Barnes '85': Ben Mendelsohn '86': Ben Stiller '87': Ben Whishaw '88': Benedict Cumberbatch '89': Benedict Wong '90': Benicio del Toro '91': Bill Gates '92': Bill Hader '93': Bill Murray '94': Bill Pullman '95': Bill Skarsgård '96': Billie 
Eilish '97': Billie Lourd '98': Billy Crudup '99': Billy Porter '100': Blake Lively '101': Bob Odenkirk '102': Bonnie Wright '103': Boyd Holbrook '104': Brad Pitt '105': Bradley Cooper '106': Brendan Fraser '107': Brian Cox '108': Brie Larson '109': Brittany Snow '110': Bryan Cranston '111': Bryce Dallas Howard '112': Busy Philipps '113': Caitriona Balfe '114': Cameron Diaz '115': Camila Cabello '116': Camila Mendes '117': Cardi B '118': Carey Mulligan '119': Carla Gugino '120': Carrie Underwood '121': Casey Affleck '122': Cate Blanchett '123': Catherine Keener '124': Catherine Zeta-Jones '125': Celine Dion '126': Chace Crawford '127': Chadwick Boseman '128': Channing Tatum '129': Charlie Cox '130': Charlie Day '131': Charlie Hunnam '132': Charlie Plummer '133': Charlize Theron '134': Chiara Ferragni '135': Chiwetel Ejiofor '136': Chloe Bennet '137': Chloe Grace Moretz '138': Chloe Sevigny '139': Chloë Grace Moretz '140': Chloë Sevigny '141': Chris Cooper '142': Chris Evans '143': Chris Hemsworth '144': Chris Martin '145': Chris Messina '146': Chris Noth '147': Chris O'Dowd '148': Chris Pine '149': Chris Pratt '150': Chris Tucker '151': Chrissy Teigen '152': Christian Bale '153': Christian Slater '154': Christina Aguilera '155': Christina Applegate '156': Christina Hendricks '157': Christina Milian '158': Christina Ricci '159': Christine Baranski '160': Christoph Waltz '161': Christopher Plummer '162': Christopher Walken '163': Cillian Murphy '164': Claire Foy '165': Clive Owen '166': Clive Standen '167': Cobie Smulders '168': Colin Farrell '169': Colin Firth '170': Colin Hanks '171': Connie Britton '172': Conor McGregor '173': Constance Wu '174': Constance Zimmer '175': Courteney Cox '176': Cristiano Ronaldo '177': Daisy Ridley '178': Dak Prescott '179': Dakota Fanning '180': Dakota Johnson '181': Damian Lewis '182': Dan Stevens '183': Danai Gurira '184': Dane DeHaan '185': Daniel Craig '186': Daniel Dae Kim '187': Daniel Day-Lewis '188': Daniel Gillies '189': 
Daniel Kaluuya '190': Daniel Mays '191': Daniel Radcliffe '192': Danny DeVito '193': Darren Criss '194': Dave Bautista '195': Dave Franco '196': Dave Grohl '197': Daveed Diggs '198': David Attenborough '199': David Beckham '200': David Duchovny '201': David Harbour '202': David Oyelowo '203': David Schwimmer '204': David Tennant '205': David Thewlis '206': Dax Shepard '207': Debra Messing '208': Demi Lovato '209': Dennis Quaid '210': Denzel Washington '211': Dermot Mulroney '212': Dev Patel '213': Diane Keaton '214': Diane Kruger '215': Diane Lane '216': Diego Boneta '217': Diego Luna '218': Djimon Hounsou '219': Dolly Parton '220': Domhnall Gleeson '221': Dominic Cooper '222': Dominic Monaghan '223': Dominic West '224': Don Cheadle '225': Donald Glover '226': Donald Sutherland '227': Donald Trump '228': Dua Lipa '229': Dwayne "The Rock" Johnson '230': Dwayne Johnson '231': Dylan O'Brien '232': Ed Harris '233': Ed Helms '234': Ed Sheeran '235': Eddie Murphy '236': Eddie Redmayne '237': Edgar Ramirez '238': Edward Norton '239': Eiza Gonzalez '240': Eiza González '241': Elijah Wood '242': Elisabeth Moss '243': Elisha Cuthbert '244': Eliza Coupe '245': Elizabeth Banks '246': Elizabeth Debicki '247': Elizabeth Lail '248': Elizabeth McGovern '249': Elizabeth Moss '250': Elizabeth Olsen '251': Elle Fanning '252': Ellen DeGeneres '253': Ellen Page '254': Ellen Pompeo '255': Ellie Goulding '256': Elon Musk '257': Emile Hirsch '258': Emilia Clarke '259': Emilia Fox '260': Emily Beecham '261': Emily Blunt '262': Emily Browning '263': Emily Deschanel '264': Emily Hampshire '265': Emily Mortimer '266': Emily Ratajkowski '267': Emily VanCamp '268': Emily Watson '269': Emma Bunton '270': Emma Chamberlain '271': Emma Corrin '272': Emma Mackey '273': Emma Roberts '274': Emma Stone '275': Emma Thompson '276': Emma Watson '277': Emmanuelle Chriqui '278': Emmy Rossum '279': Eoin Macken '280': Eric Bana '281': Ethan Hawke '282': Eva Green '283': Eva Longoria '284': Eva Mendes '285': 
Evan Peters '286': Evan Rachel Wood '287': Evangeline Lilly '288': Ewan McGregor '289': Ezra Miller '290': Felicity Huffman '291': Felicity Jones '292': Finn Wolfhard '293': Florence Pugh '294': Florence Welch '295': Forest Whitaker '296': Freddie Highmore '297': Freddie Prinze Jr. '298': Freema Agyeman '299': Freida Pinto '300': Freya Allan '301': Gabrielle Union '302': Gael Garcia Bernal '303': Gael García Bernal '304': Gal Gadot '305': Garrett Hedlund '306': Gary Oldman '307': Gemma Arterton '308': Gemma Chan '309': Gemma Whelan '310': George Clooney '311': George Lucas '312': Gerard Butler '313': Giancarlo Esposito '314': Giannis Antetokounmpo '315': Gigi Hadid '316': Gillian Anderson '317': Gillian Jacobs '318': Gina Carano '319': Gina Gershon '320': Gina Rodriguez '321': Ginnifer Goodwin '322': Gisele Bundchen '323': Glenn Close '324': Grace Kelly '325': Greg Kinnear '326': Greta Gerwig '327': Greta Scacchi '328': Greta Thunberg '329': Gugu Mbatha-Raw '330': Guy Ritchie '331': Gwen Stefani '332': Gwendoline Christie '333': Gwyneth Paltrow '334': Hafthor Bjornsson '335': Hailee Steinfeld '336': Hailey Bieber '337': Haley Joel Osment '338': Halle Berry '339': Hannah Simone '340': Harrison Ford '341': Harry Styles '342': Harvey Weinstein '343': Hayden Panettiere '344': Hayley Atwell '345': Helen Hunt '346': Helen Mirren '347': Helena Bonham Carter '348': Henry Cavill '349': Henry Golding '350': Hilary Swank '351': Himesh Patel '352': Hozier '353': Hugh Bonneville '354': Hugh Dancy '355': Hugh Grant '356': Hugh Jackman '357': Hugh Laurie '358': Ian Somerhalder '359': Idris Elba '360': Imelda Staunton '361': Imogen Poots '362': Ioan Gruffudd '363': Isabella Rossellini '364': Isabelle Huppert '365': Isla Fisher '366': Issa Rae '367': Iwan Rheon '368': J.K. Rowling '369': J.K. 
Simmons '370': Jack Black '371': Jack Reynor '372': Jack Whitehall '373': Jackie Chan '374': Jada Pinkett Smith '375': Jaden Smith '376': Jaimie Alexander '377': Jake Gyllenhaal '378': Jake Johnson '379': Jake T. Austin '380': James Cameron '381': James Corden '382': James Franco '383': James Marsden '384': James McAvoy '385': James Norton '386': Jamie Bell '387': Jamie Chung '388': Jamie Dornan '389': Jamie Foxx '390': Jamie Lee Curtis '391': Jamie Oliver '392': Jane Fonda '393': Jane Krakowski '394': Jane Levy '395': Jane Lynch '396': Jane Seymour '397': Janelle Monáe '398': January Jones '399': Jared Leto '400': Jason Bateman '401': Jason Clarke '402': Jason Derulo '403': Jason Isaacs '404': Jason Momoa '405': Jason Mraz '406': Jason Schwartzman '407': Jason Segel '408': Jason Statham '409': Jason Sudeikis '410': Javier Bardem '411': Jay Baruchel '412': Jay-Z '413': Jeff Bezos '414': Jeff Bridges '415': Jeff Daniels '416': Jeff Goldblum '417': Jeffrey Dean Morgan '418': Jeffrey Donovan '419': Jeffrey Wright '420': Jemima Kirke '421': Jenna Coleman '422': Jenna Fischer '423': Jenna Ortega '424': Jennifer Aniston '425': Jennifer Connelly '426': Jennifer Coolidge '427': Jennifer Esposito '428': Jennifer Garner '429': Jennifer Hudson '430': Jennifer Lawrence '431': Jennifer Lopez '432': Jennifer Love Hewitt '433': Jenny Slate '434': Jeremy Irons '435': Jeremy Renner '436': Jeremy Strong '437': Jerry Seinfeld '438': Jesse Eisenberg '439': Jesse Metcalfe '440': Jesse Plemons '441': Jesse Tyler Ferguson '442': Jesse Williams '443': Jessica Alba '444': Jessica Biel '445': Jessica Chastain '446': Jessica Lange '447': Jessie Buckley '448': Jim Carrey '449': Jim Parsons '450': Joan Collins '451': Joan Cusack '452': Joanne Froggatt '453': Joaquin Phoenix '454': Jodie Comer '455': Jodie Foster '456': Joe Jonas '457': Joe Keery '458': Joel Edgerton '459': Joel Kinnaman '460': Joel McHale '461': John Boyega '462': John C. 
Reilly '463': John Cena '464': John Cho '465': John Cleese '466': John Corbett '467': John David Washington '468': John Goodman '469': John Hawkes '470': John Krasinski '471': John Legend '472': John Leguizamo '473': John Lithgow '474': John Malkovich '475': John Mayer '476': John Mulaney '477': John Oliver '478': John Slattery '479': John Travolta '480': John Turturro '481': Johnny Depp '482': Johnny Knoxville '483': Jon Bernthal '484': Jon Favreau '485': Jon Hamm '486': Jonah Hill '487': Jonathan Groff '488': Jonathan Majors '489': Jonathan Pryce '490': Jonathan Rhys Meyers '491': Jordan Peele '492': Jordana Brewster '493': Joseph Fiennes '494': Joseph Gordon-Levitt '495': Josh Allen '496': Josh Brolin '497': Josh Gad '498': Josh Hartnett '499': Josh Hutcherson '500': Josh Radnor '501': Jude Law '502': Judy Dench '503': Judy Greer '504': Julia Garner '505': Julia Louis-Dreyfus '506': Julia Roberts '507': Julia Stiles '508': Julian Casablancas '509': Julian McMahon '510': Julianna Margulies '511': Julianne Hough '512': Julianne Moore '513': Julianne Nicholson '514': Juliette Binoche '515': Juliette Lewis '516': Juno Temple '517': Jurnee Smollett '518': Justin Bartha '519': Justin Bieber '520': Justin Hartley '521': Justin Herbert '522': Justin Long '523': Justin Theroux '524': Justin Timberlake '525': KJ Apa '526': Kaitlyn Dever '527': Kaley Cuoco '528': Kanye West '529': Karl Urban '530': Kat Dennings '531': Kate Beckinsale '532': Kate Bosworth '533': Kate Hudson '534': Kate Mara '535': Kate Middleton '536': Kate Upton '537': Kate Walsh '538': Kate Winslet '539': Katee Sackhoff '540': Katherine Heigl '541': Katherine Langford '542': Katherine Waterston '543': Kathryn Hahn '544': Katie Holmes '545': Katie McGrath '546': Katy Perry '547': Kaya Scodelario '548': Keanu Reeves '549': Keegan-Michael Key '550': Keira Knightley '551': Keke Palmer '552': Kelly Clarkson '553': Kelly Macdonald '554': Kelly Marie Tran '555': Kelly Reilly '556': Kelly Ripa '557': Kelvin 
Harrison Jr. '558': Keri Russell '559': Kerry Washington '560': Kevin Bacon '561': Kevin Costner '562': Kevin Hart '563': Kevin Spacey '564': Ki Hong Lee '565': Kiefer Sutherland '566': Kieran Culkin '567': Kiernan Shipka '568': Kim Dickens '569': Kim Kardashian '570': Kirsten Dunst '571': Kit Harington '572': Kourtney Kardashian '573': Kristen Bell '574': Kristen Stewart '575': Kristen Wiig '576': Kristin Davis '577': Krysten Ritter '578': Kyle Chandler '579': Kylie Jenner '580': Kylie Minogue '581': Lady Gaga '582': Lake Bell '583': Lakeith Stanfield '584': Lamar Jackson '585': Lana Del Rey '586': Laura Dern '587': Laura Harrier '588': Laura Linney '589': Laura Prepon '590': Laurence Fishburne '591': Laverne Cox '592': LeBron James '593': Lea Michele '594': Lea Seydoux '595': Lee Pace '596': Leighton Meester '597': Lena Headey '598': Leonardo Da Vinci '599': Leonardo DiCaprio '600': Leslie Mann '601': Leslie Odom Jr. '602': Lewis Hamilton '603': Liam Hemsworth '604': Liam Neeson '605': Lili Reinhart '606': Lily Aldridge '607': Lily Allen '608': Lily Collins '609': Lily James '610': Lily Rabe '611': Lily Tomlin '612': Lin-Manuel Miranda '613': Linda Cardellini '614': Lionel Messi '615': Lisa Bonet '616': Lisa Kudrow '617': Liv Tyler '618': Lizzo '619': Logan Lerman '620': Lorde '621': Lucy Boynton '622': Lucy Hale '623': Lucy Lawless '624': Lucy Liu '625': Luke Evans '626': Luke Perry '627': Luke Wilson '628': Lupita Nyong'o '629': Léa Seydoux '630': Mackenzie Davis '631': Madelaine Petsch '632': Mads Mikkelsen '633': Mae Whitman '634': Maggie Gyllenhaal '635': Maggie Q '636': Maggie Siff '637': Maggie Smith '638': Mahershala Ali '639': Mahira Khan '640': Maisie Richardson-Sellers '641': Maisie Williams '642': Mandy Moore '643': Mandy Patinkin '644': Marc Anthony '645': Margaret Qualley '646': Margot Robbie '647': Maria Sharapova '648': Marion Cotillard '649': Marisa Tomei '650': Mariska Hargitay '651': Mark Hamill '652': Mark Ruffalo '653': Mark Strong '654': 
Mark Wahlberg '655': Mark Zuckerberg '656': Marlon Brando '657': Martin Freeman '658': Martin Scorsese '659': Mary Elizabeth Winstead '660': Mary J. Blige '661': Mary Steenburgen '662': Mary-Louise Parker '663': Matt Bomer '664': Matt Damon '665': Matt LeBlanc '666': Matt Smith '667': Matthew Fox '668': Matthew Goode '669': Matthew Macfadyen '670': Matthew McConaughey '671': Matthew Perry '672': Matthew Rhys '673': Matthew Stafford '674': Max Minghella '675': Maya Angelou '676': Maya Hawke '677': Maya Rudolph '678': Megan Fox '679': Megan Rapinoe '680': Meghan Markle '681': Mel Gibson '682': Melanie Lynskey '683': Melissa Benoist '684': Melissa McCarthy '685': Melonie Diaz '686': Meryl Streep '687': Mia Wasikowska '688': Michael B. Jordan '689': Michael C. Hall '690': Michael Caine '691': Michael Cera '692': Michael Cudlitz '693': Michael Douglas '694': Michael Ealy '695': Michael Fassbender '696': Michael Jordan '697': Michael Keaton '698': Michael Pena '699': Michael Peña '700': Michael Phelps '701': Michael Shannon '702': Michael Sheen '703': Michael Stuhlbarg '704': Michelle Dockery '705': Michelle Monaghan '706': Michelle Obama '707': Michelle Pfeiffer '708': Michelle Rodriguez '709': Michelle Williams '710': Michelle Yeoh '711': Michiel Huisman '712': Mila Kunis '713': Miles Teller '714': Milla Jovovich '715': Millie Bobby Brown '716': Milo Ventimiglia '717': Mindy Kaling '718': Miranda Cosgrove '719': Miranda Kerr '720': Mireille Enos '721': Molly Ringwald '722': Morgan Freeman '723': Mélanie Laurent '724': Naomi Campbell '725': Naomi Harris '726': Naomi Scott '727': Naomi Watts '728': Naomie Harris '729': Nas '730': Natalie Dormer '731': Natalie Imbruglia '732': Natalie Morales '733': Natalie Portman '734': Nathalie Emmanuel '735': Nathalie Portman '736': Nathan Fillion '737': Naya Rivera '738': Neil Patrick Harris '739': Neil deGrasse Tyson '740': Neve Campbell '741': Neymar Jr. 
'742': Nicholas Braun '743': Nicholas Hoult '744': Nick Jonas '745': Nick Kroll '746': Nick Offerman '747': Nick Robinson '748': Nicole Kidman '749': Nikolaj Coster-Waldau '750': Nina Dobrev '751': Noah Centineo '752': Noomi Rapace '753': Norman Reedus '754': Novak Djokovic '755': Octavia Spencer '756': Odessa Young '757': Odette Annable '758': Olivia Colman '759': Olivia Cooke '760': Olivia Holt '761': Olivia Munn '762': Olivia Wilde '763': Oprah Winfrey '764': Orlando Bloom '765': Oscar Isaac '766': Owen Wilson '767': Pablo Picasso '768': Patrick Dempsey '769': Patrick Mahomes '770': Patrick Stewart '771': Patrick Wilson '772': Paul Bettany '773': Paul Dano '774': Paul Giamatti '775': Paul McCartney '776': Paul Rudd '777': Paul Wesley '778': Paula Patton '779': Pedro Almodóvar '780': Pedro Pascal '781': Penelope Cruz '782': Penélope Cruz '783': Pete Davidson '784': Peter Dinklage '785': Phoebe Dynevor '786': Phoebe Waller-Bridge '787': Pierce Brosnan '788': Portia de Rossi '789': Priyanka Chopra '790': Quentin Tarantino '791': Rachel Bilson '792': Rachel Brosnahan '793': Rachel McAdams '794': Rachel Weisz '795': Rafe Spall '796': Rainn Wilson '797': Ralph Fiennes '798': Rami Malek '799': Rashida Jones '800': Ray Liotta '801': Ray Romano '802': Rebecca Ferguson '803': Rebecca Hall '804': Reese Witherspoon '805': Regina Hall '806': Regina King '807': Renee Zellweger '808': Renée Zellweger '809': Rhys Ifans '810': Ricardo Montalban '811': Richard Armitage '812': Richard Gere '813': Richard Jenkins '814': Richard Madden '815': Ricky Gervais '816': Ricky Martin '817': Rihanna '818': Riley Keough '819': Rita Ora '820': River Phoenix '821': Riz Ahmed '822': Rob Lowe '823': Robert Carlyle '824': Robert De Niro '825': Robert Downey Jr. 
'826': Robert Pattinson '827': Robert Sheehan '828': Robin Tunney '829': Robin Williams '830': Roger Federer '831': Rooney Mara '832': Rosamund Pike '833': Rosario Dawson '834': Rose Byrne '835': Rose Leslie '836': Roselyn Sanchez '837': Ruby Rose '838': Rupert Grint '839': Russell Brand '840': Russell Crowe '841': Russell Wilson '842': Ruth Bader Ginsburg '843': Ruth Wilson '844': Ryan Eggold '845': Ryan Gosling '846': Ryan Murphy '847': Ryan Phillippe '848': Ryan Reynolds '849': Ryan Seacrest '850': Salma Hayek '851': Sam Claflin '852': Sam Heughan '853': Sam Rockwell '854': Sam Smith '855': Samara Weaving '856': Samuel L. Jackson '857': Sandra Bullock '858': Sandra Oh '859': Saoirse Ronan '860': Sarah Gadon '861': Sarah Hyland '862': Sarah Jessica Parker '863': Sarah Michelle Gellar '864': Sarah Paulson '865': Sarah Silverman '866': Sarah Wayne Callies '867': Sasha Alexander '868': Scarlett Johansson '869': Scott Speedman '870': Sean Bean '871': Sebastian Stan '872': Selena Gomez '873': Selma Blair '874': Serena Williams '875': Seth MacFarlane '876': Seth Meyers '877': Seth Rogen '878': Shailene Woodley '879': Shakira '880': Shania Twain '881': Sharlto Copley '882': Shawn Mendes '883': Shia LaBeouf '884': Shiri Appleby '885': Shohreh Aghdashloo '886': Shonda Rhimes '887': Sienna Miller '888': Sigourney Weaver '889': Simon Baker '890': Simon Cowell '891': Simon Pegg '892': Simone Biles '893': Sofia Boutella '894': Sofia Vergara '895': Sophie Turner '896': Sophie Wessex '897': Stanley Tucci '898': Stephen Amell '899': Stephen Colbert '900': Stephen Curry '901': Stephen Dorff '902': Sterling K. Brown '903': Sterling Knight '904': Steve Carell '905': Steven Yeun '906': Susan Sarandon '907': Taika Waititi '908': Taraji P. 
Henson '909': Taron Egerton '910': Taylor Hill '911': Taylor Kitsch '912': Taylor Lautner '913': Taylor Schilling '914': Taylor Swift '915': Teresa Palmer '916': Terrence Howard '917': Tessa Thompson '918': Thandie Newton '919': The Weeknd '920': Theo James '921': Thomas Brodie-Sangster '922': Thomas Jane '923': Tiger Woods '924': Tilda Swinton '925': Tim Burton '926': Tim Cook '927': Timothee Chalamet '928': Timothy Olyphant '929': Timothy Spall '930': Timothée Chalamet '931': Tina Fey '932': Tobey Maguire '933': Toby Jones '934': Toby Kebbell '935': Toby Regbo '936': Tom Brady '937': Tom Brokaw '938': Tom Cavanagh '939': Tom Cruise '940': Tom Ellis '941': Tom Felton '942': Tom Hanks '943': Tom Hardy '944': Tom Hiddleston '945': Tom Holland '946': Tom Hollander '947': Tom Hopper '948': Tom Selleck '949': Toni Collette '950': Tony Hale '951': Topher Grace '952': Tracee Ellis Ross '953': Tyra Banks '954': Tyrese Gibson '955': Uma Thurman '956': Usain Bolt '957': Uzo Aduba '958': Vanessa Hudgens '959': Vanessa Kirby '960': Vera Farmiga '961': Victoria Pedretti '962': Viggo Mortensen '963': Vin Diesel '964': Vince Vaughn '965': Vincent Cassel '966': Vincent D'Onofrio '967': Vincent Kartheiser '968': Viola Davis '969': Walton Goggins '970': Wes Anderson '971': Wes Bentley '972': Whoopi Goldberg '973': Will Ferrell '974': Will Poulter '975': Willem Dafoe '976': William Jackson Harper '977': William Shatner '978': Winona Ryder '979': Woody Harrelson '980': Yara Shahidi '981': Yvonne Strahovski '982': Zac Efron '983': Zach Braff '984': Zach Galifianakis '985': Zachary Levi '986': Zachary Quinto '987': Zayn Malik '988': Zazie Beetz '989': Zendaya '990': Zoe Kazan '991': Zoe Kravitz '992': Zoe Saldana '993': Zoey Deutch '994': Zooey Deschanel '995': Zoë Kravitz '996': Zoë Saldana - name: embeddings sequence: float64 splits: - name: train num_bytes: 303785226.616 num_examples: 18184 download_size: 305594693 dataset_size: 303785226.616 configs: - config_name: default 
data_files: - split: train path: data/train-* --- # Celebrity 1000 Top 1000 celebrities. 18,184 images. 256x256. Square cropped to face. Embeddings generated with [tonyassi/celebrity-classifier](https://huggingface.co/tonyassi/celebrity-classifier).
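The `embeddings` column lends itself to nearest-neighbor face lookup. A minimal cosine-similarity sketch — the random vectors and name list below are placeholders standing in for the real classifier embeddings, which you would obtain by loading the dataset itself:

```python
import numpy as np

# Placeholder gallery: 5 stored "face embeddings" (random stand-ins for the
# real vectors in the `embeddings` column) and their identity labels.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(5, 128))
labels = ["Adele", "Zendaya", "Tom Hanks", "Rihanna", "Lizzo"]

def cosine_nearest(query, gallery):
    """Return the index of the gallery row most similar to `query`."""
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    q = query / np.linalg.norm(query)
    return int(np.argmax(g @ q))

# A query identical to a stored embedding should match itself.
idx = cosine_nearest(gallery[2], gallery)
print(labels[idx])  # Tom Hanks
```

The same routine applies unchanged to the real 18,184-vector gallery; for large galleries a dedicated index (e.g. FAISS) would replace the brute-force matrix product.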
SangChan/KCC_Profit_DataSet_v2
--- task_categories: - text-generation language: - ko pretty_name: ' KCC_Profit_DataSet_v2' tags: - chemistry --- # Dataset Card for "SangChan/KCC_Profit_DataSet_v2" ## Project Repo - Github Repo: [Beomi/KoAlpaca](https://github.com/Beomi/KoAlpaca) ## How to use ```python >>> from datasets import load_dataset >>> ds = load_dataset("SangChan/KCC_Profit_DataSet_v2", split="train") >>> ds Dataset({ features: ['instruction', 'input', 'output'], num_rows: 21155 }) ``` ```python >>> ds[0] {'instruction': 'KCC 담당자의 이름을 알려줘', 'input': 'KCC 담당자', 'output': '담당자는 박상찬 책임입니다.'} ```
ericdoug/SFT4All
--- license: gpl-3.0 ---
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-44000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 1012957 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
kmljt/RESIDE-6K
--- license: mit ---
AayushShah/SQL_PlainText_Combined
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: text dtype: string - name: target dtype: string splits: - name: train num_bytes: 349116676.7610253 num_examples: 306706 - name: test num_bytes: 38791374.23897472 num_examples: 34079 download_size: 98654951 dataset_size: 387908051.0 --- # Dataset Card for "SQL_PlainText_Combined" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bigbio/jnlpba
--- language: - en bigbio_language: - English license: cc-by-3.0 multilinguality: monolingual bigbio_license_shortname: CC_BY_3p0 pretty_name: JNLPBA homepage: http://www.geniaproject.org/shared-tasks/bionlp-jnlpba-shared-task-2004 bigbio_pubmed: True bigbio_public: True bigbio_tasks: - NAMED_ENTITY_RECOGNITION --- # Dataset Card for JNLPBA ## Dataset Description - **Homepage:** http://www.geniaproject.org/shared-tasks/bionlp-jnlpba-shared-task-2004 - **Pubmed:** True - **Public:** True - **Tasks:** NER NER For Bio-Entities ## Citation Information ``` @inproceedings{collier-kim-2004-introduction, title = "Introduction to the Bio-entity Recognition Task at {JNLPBA}", author = "Collier, Nigel and Kim, Jin-Dong", booktitle = "Proceedings of the International Joint Workshop on Natural Language Processing in Biomedicine and its Applications ({NLPBA}/{B}io{NLP})", month = aug # " 28th and 29th", year = "2004", address = "Geneva, Switzerland", publisher = "COLING", url = "https://aclanthology.org/W04-1213", pages = "73--78", } ```
open-llm-leaderboard/details_touqir__Cyrax-7B
--- pretty_name: Evaluation run of touqir/Cyrax-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [touqir/Cyrax-7B](https://huggingface.co/touqir/Cyrax-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_touqir__Cyrax-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-14T23:34:58.336806](https://huggingface.co/datasets/open-llm-leaderboard/details_touqir__Cyrax-7B/blob/main/results_2024-02-14T23-34-58.336806.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527021919239829,\n\ \ \"acc_stderr\": 0.03198108902065119,\n \"acc_norm\": 0.6514575047354266,\n\ \ \"acc_norm_stderr\": 0.032651586646568864,\n \"mc1\": 0.6303549571603427,\n\ \ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7701130353030725,\n\ \ \"mc2_stderr\": 0.01431543014964792\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274777,\n\ \ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7457677753435571,\n\ \ \"acc_stderr\": 0.004345388614520019,\n \"acc_norm\": 0.8818960366460864,\n\ \ \"acc_norm_stderr\": 0.003220716126685038\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\ \ \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\ acc_norm\": 0.41005291005291006,\n 
\"acc_norm_stderr\": 0.02533120243894443\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\ \ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\ \ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\ \ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\ \ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\ \ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \ \ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \ \ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\ acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\ acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\ acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\ \ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\ \ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\ \ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\ \ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\ \ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\ \ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\ \ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\ \ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 
0.4446927374301676,\n\ \ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\ \ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \ \ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\ \ \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n\ \ \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\ \ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \ \ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\ \ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\ \ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\ \ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7701130353030725,\n\ \ \"mc2_stderr\": 0.01431543014964792\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.01032971283278572\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \ \ \"acc_stderr\": 0.012714401009923649\n }\n}\n```" repo_url: https://huggingface.co/touqir/Cyrax-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|arc:challenge|25_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-14T23-34-58.336806.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|gsm8k|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_14T23_34_58.336806 path: - 
'**/details_harness|hellaswag|10_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|truthfulqa:mc|0_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-14T23-34-58.336806.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_14T23_34_58.336806 path: - '**/details_harness|winogrande|5_2024-02-14T23-34-58.336806.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-14T23-34-58.336806.parquet' - config_name: results data_files: - split: 
2024_02_14T23_34_58.336806 path: - results_2024-02-14T23-34-58.336806.parquet - split: latest path: - results_2024-02-14T23-34-58.336806.parquet
---

# Dataset Card for Evaluation run of touqir/Cyrax-7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [touqir/Cyrax-7B](https://huggingface.co/touqir/Cyrax-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_touqir__Cyrax-7B",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-02-14T23:34:58.336806](https://huggingface.co/datasets/open-llm-leaderboard/details_touqir__Cyrax-7B/blob/main/results_2024-02-14T23-34-58.336806.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6527021919239829, "acc_stderr": 0.03198108902065119, "acc_norm": 0.6514575047354266, "acc_norm_stderr": 0.032651586646568864, "mc1": 0.6303549571603427, "mc1_stderr": 0.01689818070697388, "mc2": 0.7701130353030725, "mc2_stderr": 0.01431543014964792 }, "harness|arc:challenge|25": { "acc": 0.7133105802047781, "acc_stderr": 0.013214986329274777, "acc_norm": 0.7295221843003413, "acc_norm_stderr": 0.012980954547659556 }, "harness|hellaswag|10": { "acc": 0.7457677753435571, "acc_stderr": 0.004345388614520019, "acc_norm": 0.8818960366460864, "acc_norm_stderr": 0.003220716126685038 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 
0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.02533120243894443, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.02533120243894443 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726854, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726854 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 
0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 
0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258172, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258172 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4446927374301676, "acc_stderr": 0.01661988198817702, "acc_norm": 0.4446927374301676, "acc_norm_stderr": 0.01661988198817702 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083136, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083136 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146292, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146292 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.019047485239360378, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.019047485239360378 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 
0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699121, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.6303549571603427, "mc1_stderr": 0.01689818070697388, "mc2": 0.7701130353030725, "mc2_stderr": 0.01431543014964792 }, "harness|winogrande|5": { "acc": 0.8389897395422258, "acc_stderr": 0.01032971283278572 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923649 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
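For a quick sanity check, the per-task metrics reported above can be averaged in a few lines. The task names and values below are copied verbatim from the results JSON earlier in this card; the unweighted mean is only an illustration of how such aggregation works, not the leaderboard's exact scoring formula:

```python
# Average a few harness task scores from the results JSON above.
# The simple unweighted mean shown here is illustrative, not the
# leaderboard's exact aggregation.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7295221843003413},
    "harness|hellaswag|10": {"acc_norm": 0.8818960366460864},
    "harness|winogrande|5": {"acc": 0.8389897395422258},
    "harness|gsm8k|5": {"acc": 0.6921910538286581},
}

def task_score(metrics: dict) -> float:
    # Prefer the normalized accuracy when it is reported.
    return metrics.get("acc_norm", metrics.get("acc"))

scores = [task_score(m) for m in results.values()]
average = sum(scores) / len(scores)
print(f"mean over {len(scores)} tasks: {average:.4f}")
```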
irds/mr-tydi_th_dev
--- pretty_name: '`mr-tydi/th/dev`' viewer: false source_datasets: ['irds/mr-tydi_th'] task_categories: - text-retrieval --- # Dataset Card for `mr-tydi/th/dev` The `mr-tydi/th/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/th/dev). # Data This dataset provides: - `queries` (i.e., topics); count=807 - `qrels`: (relevance assessments); count=817 - For `docs`, use [`irds/mr-tydi_th`](https://huggingface.co/datasets/irds/mr-tydi_th) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/mr-tydi_th_dev', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/mr-tydi_th_dev', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @article{Zhang2021MrTyDi, title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval}, author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin}, year={2021}, journal={arXiv:2108.08787}, } @article{Clark2020TyDiQa, title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages}, author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki}, year={2020}, journal={Transactions of the Association for Computational Linguistics} } ```
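After loading, qrels records like those above are typically folded into a per-query lookup before computing retrieval metrics. A minimal sketch — the three records below are made up for illustration; real ones come from the `load_dataset('irds/mr-tydi_th_dev', 'qrels')` call shown in the usage section:

```python
# Fold qrels records into {query_id: {doc_id: relevance}} for evaluation.
# The records below are placeholders with the same fields as the dataset's
# qrels ({'query_id', 'doc_id', 'relevance', 'iteration'}).
from collections import defaultdict

qrels_records = [
    {"query_id": "q1", "doc_id": "d10", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d11", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d42", "relevance": 1, "iteration": "0"},
]

qrels = defaultdict(dict)
for rec in qrels_records:
    qrels[rec["query_id"]][rec["doc_id"]] = rec["relevance"]

# Relevant documents for each query (relevance > 0):
relevant = {qid: {d for d, r in judged.items() if r > 0}
            for qid, judged in qrels.items()}
print(relevant)
```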
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_2_1000
--- dataset_info: features: - name: id dtype: int64 - name: response dtype: string splits: - name: train num_bytes: 959 num_examples: 32 download_size: 2006 dataset_size: 959 --- # Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_2_1000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sam1120/dropoff-utcustom-TEST
--- dataset_info: features: - name: name dtype: string - name: pixel_values dtype: image - name: labels dtype: image splits: - name: train num_bytes: 286143613.0 num_examples: 101 download_size: 86640225 dataset_size: 286143613.0 --- # Dataset Card for "dropoff-utcustom-TEST" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
arieg/bw_spec_cls_100_00_large_200
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': '10' '1': '1039' '2': '1040' '3': '1082' '4': '1083' '5': '1102' '6': '1193' '7': '1195' '8': '1196' '9': '1197' '10': '1270' '11': '1276' '12': '1277' '13': '1278' '14': '140' '15': '141' '16': '1417' '17': '1427' '18': '1443' '19': '1482' '20': '1510' '21': '1544' '22': '1642' '23': '1644' '24': '1649' '25': '1661' '26': '1663' '27': '1666' '28': '1673' '29': '1680' '30': '1681' '31': '1682' '32': '1683' '33': '1684' '34': '1685' '35': '1686' '36': '1687' '37': '1688' '38': '1689' '39': '1701' '40': '1702' '41': '1703' '42': '1704' '43': '1706' '44': '1720' '45': '1732' '46': '1733' '47': '1735' '48': '1736' '49': '1883' '50': '1891' '51': '190' '52': '1924' '53': '1925' '54': '1929' '55': '193' '56': '1930' '57': '194' '58': '197' '59': '2' '60': '200' '61': '203' '62': '204' '63': '207' '64': '210' '65': '211' '66': '212' '67': '213' '68': '255' '69': '256' '70': '368' '71': '424' '72': '5' '73': '534' '74': '540' '75': '546' '76': '574' '77': '615' '78': '620' '79': '621' '80': '625' '81': '666' '82': '667' '83': '676' '84': '694' '85': '695' '86': '714' '87': '715' '88': '716' '89': '718' '90': '777' '91': '814' '92': '821' '93': '822' '94': '825' '95': '853' '96': '897' '97': '995' '98': '997' '99': '998' splits: - name: train num_bytes: 1082075957.978 num_examples: 19993 - name: test num_bytes: 112361928.0 num_examples: 2000 download_size: 1214816598 dataset_size: 1194437885.978 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
tfshaman/metamath_sympy_v1
--- dataset_info: features: - name: output dtype: string - name: answer dtype: string - name: question dtype: string - name: code_output dtype: float64 - name: data_type dtype: string splits: - name: train num_bytes: 446858993 num_examples: 185804 download_size: 155705297 dataset_size: 446858993 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "metamath_sympy_v1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-eval-samsum-samsum-a5c306-1520055006
--- type: predictions tags: - autotrain - evaluation datasets: - samsum eval_info: task: summarization model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum_V2 metrics: [] dataset_name: samsum dataset_config: samsum dataset_split: test col_mapping: text: dialogue target: summary --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Summarization * Model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum_V2 * Dataset: samsum * Config: samsum * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model.
yuyijiong/Multi-doc-QA-CommonCrawl
---
license: cc-by-nc-4.0
size_categories:
- 10K<n<100K
language:
- en
---
* Update on December 24, 2023: Improved the format of answers: all answers are now forced to quote the referenced original text first.
* English multi-document Q&A data created using RedPajamaCommonCrawl data as reference text.
* In the raw dataset, each sample contains <font color=red> one reference document, 199 irrelevant documents, and a Q-A pair based on the reference document</font>. It can be used to train models to extract the target information from a large number of documents.
* After filtering, integrating, and transforming the raw data into chatml-format instruction fine-tuning data, each sample contains approximately 30 reference documents and 5 corresponding QA pairs.

<br/>

* Update on December 4, 2023: Improved the answer format; all answers must now quote the original text before answering.
* English multi-document Q&A data built with RedPajamaCommonCrawl data as the reference text.
* In the raw data, each sample contains <font color=red> one reference document, 199 irrelevant documents, and one Q&A pair based on the reference document</font>, which can train a model's ability to extract key information from a large number of documents.
* After filtering and integration, the raw data was converted into chatml-format instruction fine-tuning data; each sample contains about 30 reference documents and 5 corresponding Q&A pairs.
* dataset size: 11k
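The chatml-style samples described above can be pictured with a small sketch. The prompt wording, documents, and message structure below are illustrative assumptions rather than the dataset's actual schema; they only show how several reference documents, a question, and a quote-the-original-text-first answer fit into chatml messages:

```python
# Sketch of one chatml-style multi-document QA sample.
# Structure only: the documents, question, and answer below are placeholders,
# and the prompt wording is an assumption, not the dataset's exact format.
def build_chatml_sample(documents, question, answer):
    numbered = "\n\n".join(
        f"[Document {i + 1}] {doc}" for i, doc in enumerate(documents)
    )
    return [
        {"role": "user", "content": f"{numbered}\n\nQuestion: {question}"},
        {"role": "assistant", "content": answer},
    ]

docs = [
    "The Eiffel Tower is located in Paris.",
    "Bananas are a good source of potassium.",
]
# Per the answer-format update described above, the answer quotes the
# referenced original text before answering.
answer = ('According to "The Eiffel Tower is located in Paris.", '
          "the answer is Paris.")
sample = build_chatml_sample(docs, "Where is the Eiffel Tower?", answer)
```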
nc33/multiSpanQa_exp
--- license: mit ---
winwithaman/Brain-ich-dataset
--- license: apache-2.0 ---
liuyanchen1015/MULTI_VALUE_mnli_past_been
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 609213 num_examples: 2984 - name: dev_mismatched num_bytes: 673153 num_examples: 3048 - name: test_matched num_bytes: 625803 num_examples: 2973 - name: test_mismatched num_bytes: 655878 num_examples: 2973 - name: train num_bytes: 24416918 num_examples: 115853 download_size: 17199356 dataset_size: 26980965 --- # Dataset Card for "MULTI_VALUE_mnli_past_been" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
davidfant/rapidapi-example-responses-tokenized-xlm-roberta
--- dataset_info: features: - name: id dtype: string - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: category dtype: string splits: - name: train num_bytes: 166566222.06213877 num_examples: 43755 - name: test num_bytes: 18508626.93786124 num_examples: 4862 download_size: 62641988 dataset_size: 185074849.0 --- # Dataset Card for "rapidapi-example-responses-tokenized-xlm-roberta" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AlekseyKorshuk/cup-it-ds-ranked-small
--- dataset_info: features: - name: prompt dtype: string - name: '0' dtype: string - name: '1' dtype: string - name: '2' dtype: string - name: '3' dtype: string - name: '4' dtype: string splits: - name: train num_bytes: 10582267 num_examples: 3965 - name: validation num_bytes: 1157743 num_examples: 441 download_size: 7786012 dataset_size: 11740010 --- # Dataset Card for "cup-it-ds-ranked-small" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
MindonFire/PythonGeneralQuery
--- dataset_info: features: - name: Prompts dtype: string splits: - name: train num_bytes: 7156678 num_examples: 13109 download_size: 2180901 dataset_size: 7156678 configs: - config_name: default data_files: - split: train path: data/train-* ---
huggingface/autotrain-data-8xid-g6rv-rkuc4
BangumiBase/azumangadaioh
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Azumanga Daioh This is the image base of bangumi Azumanga Daioh, we detected 14 characters, 3047 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 76 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 83 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 607 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 311 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) 
| ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 233 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 502 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 478 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 151 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 31 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 500 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 20 | 
[Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 7 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | N/A | | 12 | 10 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | noise | 38 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
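The table above lists numbered character folders plus a `-1` noise bucket. As a minimal sketch of the recommended preprocessing — assuming each character archive has already been extracted into its numbered folder, and with the function name and parameters being illustrative choices rather than part of this dataset — the noise bucket can be skipped while building a training index:

```python
from pathlib import Path


def index_characters(root, noise_dir="-1", exts=(".png", ".jpg", ".jpeg", ".webp")):
    """Collect image paths per character folder, dropping the noise bucket.

    Assumes each per-character ``dataset.zip`` has already been extracted
    into its numbered folder under ``root`` (the layout shown in the table
    above); the folder named ``-1`` holds detected noise and is skipped.
    """
    index = {}
    for sub in sorted(Path(root).iterdir()):
        if not sub.is_dir() or sub.name == noise_dir:
            continue
        # recursively gather image files for this character id
        index[sub.name] = [
            p for p in sorted(sub.rglob("*")) if p.suffix.lower() in exts
        ]
    return index
```

Note this only filters the pre-detected noise folder; the per-character folders may still contain roughly 1% mislabeled samples, so a manual pass is still advisable before training.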
Gummybear05/EY
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: audio struct: - name: array sequence: float64 - name: path dtype: string - name: sample_rate dtype: int64 - name: text dtype: string - name: scriptId dtype: int64 - name: fileNm dtype: string - name: recrdTime dtype: float64 - name: recrdQuality dtype: int64 - name: recrdDt dtype: string - name: scriptSetNo dtype: string - name: recrdEnvrn dtype: string - name: colctUnitCode dtype: string - name: cityCode dtype: string - name: recrdUnit dtype: string - name: convrsThema dtype: string - name: gender dtype: string - name: recorderId dtype: string - name: age dtype: int64 splits: - name: train num_bytes: 4505684820 num_examples: 5400 download_size: 1029740456 dataset_size: 4505684820 --- # Dataset Card for "EY" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/king_george_v_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of king_george_v/キングジョージ5世/英王乔治五世 (Azur Lane) This is the dataset of king_george_v/キングジョージ5世/英王乔治五世 (Azur Lane), containing 74 images and their tags. The core tags of this character are `blonde_hair, long_hair, breasts, red_eyes, braid, large_breasts, ribbon, hair_ribbon, bangs, blue_ribbon, hair_between_eyes, side_braid`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 74 | 103.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/king_george_v_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 74 | 57.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/king_george_v_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 164 | 115.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/king_george_v_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 74 | 91.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/king_george_v_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 164 | 163.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/king_george_v_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/king_george_v_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters List of tag clustering results; some outfits may be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 26 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, epaulettes, black_thighhighs, ribbon_braid, cape, garter_straps, pleated_skirt, jacket, fur_trim, single_braid, saber_(weapon), smile, uniform, medal, simple_background, white_skirt, holding, sheath | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1boy, 1girl, blush, hetero, nipples, penis, sweat, vaginal, epaulettes, mosaic_censoring, sex_from_behind, solo_focus, dark-skinned_male, long_sleeves, navel, open_mouth, red_jacket, spread_legs, standing_sex, thighhighs, thighs, very_long_hair, breasts_out, clenched_teeth, clothed_female_nude_male, cum_in_pussy, interracial, single_braid, testicles | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | looking_at_viewer, 1girl, closed_mouth, navel, smile, solo, cleavage, standing, bikini, collarbone, full_body, simple_background, white_background, arm_up, black_footwear, blush, crossed_bangs, high_heels, stomach | | 3 | 12 | 
![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | looking_at_viewer, 1girl, smile, bare_shoulders, cleavage, red_dress, blush, choker, solo, black_ribbon, collarbone, frills, open_mouth, closed_mouth, french_braid, standing, strapless_dress | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | epaulettes | black_thighhighs | ribbon_braid | cape | garter_straps | pleated_skirt | jacket | fur_trim | single_braid | saber_(weapon) | smile | uniform | medal | simple_background | white_skirt | holding | sheath | 1boy | blush | hetero | nipples | penis | sweat | vaginal | mosaic_censoring | sex_from_behind | solo_focus | dark-skinned_male | long_sleeves | navel | open_mouth | red_jacket | spread_legs | standing_sex | thighhighs | thighs | very_long_hair | breasts_out | clenched_teeth | clothed_female_nude_male | cum_in_pussy | interracial | testicles | closed_mouth | cleavage | standing | bikini | collarbone | full_body | white_background | arm_up | black_footwear | crossed_bangs | high_heels | stomach | bare_shoulders | red_dress | choker | black_ribbon | frills | french_braid | strapless_dress | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:-------------------|:---------------|:-------|:----------------|:----------------|:---------|:-----------|:---------------|:-----------------|:--------|:----------|:--------|:--------------------|:--------------|:----------|:---------|:-------|:--------|:---------|:----------|:--------|:--------|:----------|:-------------------|:------------------|:-------------|:--------------------|:---------------|:--------|:-------------|:-------------|:--------------|:---------------|:-------------|:---------|:-----------------|:--------------|:-----------------|:---------------------------|:---------------|:--------------|:------------|:---------------|:-----------|:-----------|:---------|:-------------|:------------|:-------------------|:---------|:-----------------|:----------------|:-------------|:----------|:-----------------|:------------|:---------|:---------------|:---------|:---------------|:------------------| | 0 | 26 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | 
![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | | | | | | | | | | X | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | 3 | 12 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | | X | | | | | | | | X | X | X | X | X | X | X |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c152e7f5
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 184 num_examples: 10 download_size: 1340 dataset_size: 184 --- # Dataset Card for "c152e7f5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
freshpearYoon/train_free_12
--- dataset_info: features: - name: input_features sequence: sequence: float32 - name: labels sequence: int64 splits: - name: train num_bytes: 9604671088 num_examples: 10000 download_size: 1349410105 dataset_size: 9604671088 configs: - config_name: default data_files: - split: train path: data/train-* ---
aureliojafer/twitter_dataset_1709851437
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 splits: - name: train num_bytes: 60602 num_examples: 200 download_size: 39958 dataset_size: 60602 configs: - config_name: default data_files: - split: train path: data/train-* ---
scholarly360/contracts-classification-instruction-llm-experiments
--- dataset_info: features: - name: id dtype: string - name: instruction dtype: string - name: instances list: - name: input dtype: string - name: output dtype: string - name: is_classification dtype: bool splits: - name: train num_bytes: 2227831 num_examples: 6052 - name: test num_bytes: 949692 num_examples: 2600 download_size: 1491275 dataset_size: 3177523 --- # Dataset Card for "contracts-classification-instruction-llm-experiments" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
team6/roast-history
--- license: mit ---
Howuhh/xland_minigrid
--- license: apache-2.0 ---
CyberHarem/akira_makino_onichichi
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Akira Makino This is the dataset of Akira Makino, containing 198 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). | Name | Images | Download | Description | |:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------| | raw | 198 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 420 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | raw-stage3-eyes | 511 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. | | 384x512 | 198 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x704 | 198 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x880 | 198 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 420 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 420 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-p512-640 | 390 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. | | stage3-eyes-640 | 511 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. | | stage3-eyes-800 | 511 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
safiyaalavi/biored_tokenized_new
--- dataset_info: features: - name: pmid dtype: string - name: passage dtype: string - name: tokens sequence: string - name: ner_tags sequence: int64 splits: - name: test num_bytes: 184274 num_examples: 30 - name: train num_bytes: 865185 num_examples: 148 - name: val num_bytes: 197171 num_examples: 33 download_size: 393173 dataset_size: 1246630 --- # Dataset Card for "biored_tokenized_new" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
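The `tokens` and `ner_tags` columns above are parallel sequences. A minimal sketch of decoding an example back into labeled tokens — the `id2label` mapping is hypothetical, since this card does not document which integer encodes which entity type:

```python
def labeled_tokens(example, id2label):
    """Pair each token with a human-readable tag.

    ``example`` follows the schema above (parallel ``tokens`` and
    ``ner_tags`` sequences); ``id2label`` is a hypothetical mapping
    supplied by the caller, as the tag inventory is not listed here.
    """
    if len(example["tokens"]) != len(example["ner_tags"]):
        raise ValueError("tokens and ner_tags must be parallel sequences")
    # fall back to the raw integer (as a string) for unknown tag ids
    return [
        (tok, id2label.get(tag, str(tag)))
        for tok, tag in zip(example["tokens"], example["ner_tags"])
    ]
```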
reciprocate/tinygsm_mixtral_2M
--- dataset_info: features: - name: question dtype: string - name: program dtype: string - name: result dtype: string - name: conversations list: - name: from dtype: string - name: value dtype: string splits: - name: train num_bytes: 2666401169 num_examples: 2000000 download_size: 765690660 dataset_size: 2666401169 configs: - config_name: default data_files: - split: train path: data/train-* ---
CyberHarem/amagi_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of amagi/天城/天城 (Azur Lane) This is the dataset of amagi/天城/天城 (Azur Lane), containing 500 images and their tags. The core tags of this character are `animal_ears, brown_hair, long_hair, fox_ears, breasts, purple_eyes, bangs, large_breasts, tail, animal_ear_fluff, thick_eyebrows, fox_girl, fox_tail, multiple_tails, blunt_bangs, hair_ornament`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amagi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 485.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amagi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1281 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amagi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 860.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amagi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1281 | 1.64 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amagi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/amagi_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters List of tag clustering results; some outfits may be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 36 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, white_bikini, bare_shoulders, cleavage, looking_at_viewer, see-through, very_long_hair, choker, official_alternate_costume, thigh_strap, sash, thighs, sitting, kitsune | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, holding_umbrella, oil-paper_umbrella, wide_sleeves, looking_at_viewer, red_coat, solo, purple_kimono, kitsune, petals, cherry_blossoms, sakuramon, smile, brown_gloves | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, looking_at_viewer, off_shoulder, shimenawa, solo, twintails, wide_sleeves, black_pantyhose, kyuubi, sidelocks, long_sleeves, red_kimono, detached_sleeves, bell, cherry_blossoms, ribbon, wooden_floor, holding, seiza, simple_background | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, cleavage, solo, looking_at_viewer, collarbone, 
off_shoulder, red_kimono, wide_sleeves, brown_tail, sitting, very_long_hair | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, cleavage, kimono, red_eyes, red_skirt, smile, solo, wide_sleeves, black_gloves, black_thighhighs, outdoors, pleated_skirt, full_body, standing, black_hair, brown_tail, cherry_blossoms, cloud, holding, kyuubi, looking_at_viewer, makeup, parted_lips, sakuramon, sidelocks, sky | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | steam, 1girl, cleavage, collarbone, solo, bare_shoulders, naked_towel, looking_at_viewer | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, cleavage, folding_fan, holding_fan, looking_at_viewer, thighs, black_dress, indoors, solo, bare_shoulders, feather_boa, kitsune, official_alternate_costume, window, barefoot, nail_polish, on_couch, on_side, very_long_hair | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, hetero, solo_focus, blush, nipples, open_mouth, ejaculation, heart, navel, 1boy, completely_nude, cum_in_pussy, multiple_boys, multiple_penises, overflow, sweat, uncensored, ahegao, choker, collarbone, dark-skinned_male, gangbang, handjob, huge_breasts, lactation, mosaic_censoring, sky, spread_legs, vaginal | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_bikini | bare_shoulders | cleavage | looking_at_viewer | see-through | very_long_hair | choker | official_alternate_costume | thigh_strap | sash | thighs | sitting | kitsune | 
holding_umbrella | oil-paper_umbrella | wide_sleeves | red_coat | purple_kimono | petals | cherry_blossoms | sakuramon | smile | brown_gloves | off_shoulder | shimenawa | twintails | black_pantyhose | kyuubi | sidelocks | long_sleeves | red_kimono | detached_sleeves | bell | ribbon | wooden_floor | holding | seiza | simple_background | collarbone | brown_tail | kimono | red_eyes | red_skirt | black_gloves | black_thighhighs | outdoors | pleated_skirt | full_body | standing | black_hair | cloud | makeup | parted_lips | sky | steam | naked_towel | folding_fan | holding_fan | black_dress | indoors | feather_boa | window | barefoot | nail_polish | on_couch | on_side | hetero | solo_focus | blush | nipples | open_mouth | ejaculation | heart | navel | 1boy | completely_nude | cum_in_pussy | multiple_boys | multiple_penises | overflow | sweat | uncensored | ahegao | dark-skinned_male | gangbang | handjob | huge_breasts | lactation | mosaic_censoring | spread_legs | vaginal | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:-----------------|:-----------|:--------------------|:--------------|:-----------------|:---------|:-----------------------------|:--------------|:-------|:---------|:----------|:----------|:-------------------|:---------------------|:---------------|:-----------|:----------------|:---------|:------------------|:------------|:--------|:---------------|:---------------|:------------|:------------|:------------------|:---------|:------------|:---------------|:-------------|:-------------------|:-------|:---------|:---------------|:----------|:--------|:--------------------|:-------------|:-------------|:---------|:-----------|:------------|:---------------|:-------------------|:-----------|:----------------|:------------|:-----------|:-------------|:--------|:---------|:------------
--|:------|:--------|:--------------|:--------------|:--------------|:--------------|:----------|:--------------|:---------|:-----------|:--------------|:-----------|:----------|:---------|:-------------|:--------|:----------|:-------------|:--------------|:--------|:--------|:-------|:------------------|:---------------|:----------------|:-------------------|:-----------|:--------|:-------------|:---------|:--------------------|:-----------|:----------|:---------------|:------------|:-------------------|:--------------|:----------| | 0 | 36 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | | | | | | | | | | | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | X | X | | X | | | | | | X | | | | X | | | | | | | | X | | | | | | | 
X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | X | X | | | | | | | | | | | | X | | | | X | X | X | | | | | | X | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | X | X | X | | X | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
NilanE/SmallParallelDocs-Ja_En-6k
--- license: apache-2.0 task_categories: - translation language: - ja - en --- This dataset contains document-length Japanese-English parallel texts from various sources. The intended use case is translation tasks. The cumulative dataset is under an Apache 2.0 license, but the individual sources carry their own licenses. If they conflict, assume the more restrictive clause takes priority. I am unfamiliar with licenses in general, so if someone sees any issues with the licensing situation, please let me know. # Metadata meaning: source: self-explanatory ### specific to manual fanfic: missed_lines: number of lines in the document where the index matched between the unaligned source and translation, but the content of the lines did not meet the match threshold. A high value relative to the number of lines in the document indicates either a poor translation or some other factor causing the source and translation to differ beyond the norm. inserted_lines_src: number of lines that appear only in the source document, with matches found surrounding them, indicating the line is some kind of insertion (such as a note, afterword, or title). Generally safe to ignore unless it is significant compared to the number of correctly aligned sentences, which would indicate a poor matchup between the source and translation documents, for one reason or another. inserted_lines_tgt: same as above, but for the target document instead # Dataset Sources news_commentary (unknown license?) iwslt2017 (cc-by-nc-nd-4.0) https://www2.nict.go.jp/astrec-att/member/mutiyama/ (gpl v1.2) manually scraped fanfiction and translations (apache 2.0)
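The alignment counters described above can be illustrated with a toy sketch. The dataset's actual matching procedure is not documented on this card — and since source and target are in different languages, a real pipeline would score pairs via translation or embedding similarity rather than raw string overlap — so the `difflib` scoring, threshold, and function name below are stand-ins for illustration only:

```python
from difflib import SequenceMatcher


def alignment_stats(src_lines, tgt_lines, threshold=0.6):
    """Toy illustration of the three per-document counters.

    Lines are compared pairwise by index with a similarity score; lines
    left over in only one document count as insertions.  Both the scorer
    and the threshold are placeholders for whatever matcher produced the
    real metadata.
    """
    missed = sum(
        1 for s, t in zip(src_lines, tgt_lines)
        if SequenceMatcher(None, s, t).ratio() < threshold
    )
    common = min(len(src_lines), len(tgt_lines))
    return {
        "missed_lines": missed,
        "inserted_lines_src": len(src_lines) - common,
        "inserted_lines_tgt": len(tgt_lines) - common,
    }
```

As the metadata notes, a high `missed_lines` relative to document length flags a loose translation, while large insertion counts flag notes, afterwords, or a poor source/translation pairing.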
Shoubhik8/instruct_data_valid
---
dataset_info:
  features:
  - name: instructions
    dtype: string
  - name: output
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 100019840
    num_examples: 100339
  download_size: 4396637
  dataset_size: 100019840
---

# Dataset Card for "instruct_data_valid"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
NobodyExistsOnTheInternet/ConvoEvolLIMAuncensored
---
license: mit
---
kdercksen/nlpeer_arr_qa
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: question
    dtype: string
  - name: label
    dtype: string
  splits:
  - name: train
    num_bytes: 12292871
    num_examples: 5472
  download_size: 1450298
  dataset_size: 12292871
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
CyberHarem/handa_roco_theidolmstermillionlive
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of handa_roco/伴田路子 (THE iDOLM@STER: Million Live!)

This is the dataset of handa_roco/伴田路子 (THE iDOLM@STER: Million Live!), containing 183 images and their tags.

The core tags of this character are `long_hair, bow, yellow_eyes, hair_bow, bangs, twintails, green_eyes, grey_hair, parted_bangs`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             |   Images | Size       | Download                                                                                                                            | Type       | Description                                                          |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              |      183 | 191.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/handa_roco_theidolmstermillionlive/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              |      183 | 125.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/handa_roco_theidolmstermillionlive/resolve/main/dataset-800.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  |      392 | 243.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/handa_roco_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             |      183 | 174.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/handa_roco_theidolmstermillionlive/resolve/main/dataset-1200.zip)             | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 |      392 | 318.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/handa_roco_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/handa_roco_theidolmstermillionlive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_bow, blush, headphones_around_neck, polka_dot_bow, black_skirt, bracelet, solo, long_sleeves, looking_at_viewer, open_mouth, blue_pantyhose, smile, very_long_hair, shirt, simple_background, light_brown_hair, low_twintails, pleated_skirt, white_jacket, wrist_scrunchie |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, :d, open_mouth, solo, dress, hairclip, looking_at_viewer, microphone_stand, mini_hat, navel, necklace, necktie, star_(symbol), top_hat, wrist_cuffs |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, looking_at_viewer, small_breasts, blush, nipples, pussy, :d, nude, open_mouth, female_pubic_hair, navel |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, looking_at_viewer, solo, very_long_hair, :d, bare_shoulders, open_mouth, polka_dot, white_dress, collarbone, day, hair_flower, sleeveless_dress, white_flower, blue_sky, breasts, cloud, dated, frills, holding_bouquet, light_brown_hair, mini_crown, outdoors, pearl_necklace, strapless_dress |
| 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, cleavage, collarbone, looking_at_viewer, navel, solo, earrings, side_ponytail, white_bikini, bare_shoulders, medium_breasts, open_mouth, bracelet, small_breasts, very_long_hair, bikini_skirt, blue_scrunchie, brown_eyes, frilled_bikini, hair_scrunchie, outdoors, see-through, sitting, sky, water |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bow | blush | headphones_around_neck | polka_dot_bow | black_skirt | bracelet | solo | long_sleeves | looking_at_viewer | open_mouth | blue_pantyhose | smile | very_long_hair | shirt | simple_background | light_brown_hair | low_twintails | pleated_skirt | white_jacket | wrist_scrunchie | :d | dress | hairclip | microphone_stand | mini_hat | navel | necklace | necktie | star_(symbol) | top_hat | wrist_cuffs | small_breasts | nipples | pussy | nude | female_pubic_hair | bare_shoulders | polka_dot | white_dress | collarbone | day | hair_flower | sleeveless_dress | white_flower | blue_sky | breasts | cloud | dated | frills | holding_bouquet | mini_crown | outdoors | pearl_necklace | strapless_dress | cleavage | earrings | side_ponytail | white_bikini | medium_breasts | bikini_skirt | blue_scrunchie | brown_eyes | frilled_bikini | hair_scrunchie | see-through | sitting | sky | water |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | | | | | X | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | | | X | | X | X | | | | | | | | | | | X | | | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | | | | X | | X | X | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | | | X | X | | X | X | | | X | | | | | | | | | | | | | X | | | | | | X | | | | | X | | | X | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
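The IMG+TXT packages above store one `.txt` tag file alongside each image with the same stem. A minimal sketch of collecting those pairs after unzipping (the throwaway directory and file names below are illustrative only, not part of the dataset):

```python
import tempfile
from pathlib import Path

def collect_pairs(dataset_dir):
    """Pair each image file with its same-named .txt tag file."""
    root = Path(dataset_dir)
    pairs = []
    for img in sorted(root.rglob("*.png")):
        txt = img.with_suffix(".txt")
        if txt.exists():
            pairs.append((img.name, txt.read_text(encoding="utf-8").strip()))
    return pairs

# demo on a throwaway directory mimicking the extracted layout
with tempfile.TemporaryDirectory() as d:
    Path(d, "img0.png").write_bytes(b"")            # stand-in image
    Path(d, "img0.txt").write_text("1girl, solo")   # its tag file
    pairs = collect_pairs(d)
    print(pairs)  # → [('img0.png', '1girl, solo')]
```

Images without a matching `.txt` are simply skipped, so the function tolerates stray files in the archive.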
carles-undergrad-thesis/en-id-parallel-sentences-embedding-vbert
---
dataset_info:
  features:
  - name: text_en
    dtype: string
  - name: text_id
    dtype: string
  - name: target_embedding
    sequence: float32
  - name: input_ids_en
    sequence: int64
  - name: attention_mask_en
    sequence: int64
  - name: token_type_ids_en
    sequence: int64
  - name: input_ids_id
    sequence: int64
  - name: attention_mask_id
    sequence: int64
  - name: token_type_ids_id
    sequence: int64
  splits:
  - name: train
    num_bytes: 15780096944
    num_examples: 1000000
  download_size: 4104220800
  dataset_size: 15780096944
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
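The `target_embedding` column holds one sentence-level vector per parallel pair; a common use is scoring cross-lingual similarity by cosine similarity between such vectors. A minimal sketch on toy vectors (the real embeddings are much higher-dimensional BERT-style vectors; the values here are illustrative only):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# toy stand-ins for target_embedding rows
emb_en = [0.1, 0.3, 0.5]
emb_id = [0.1, 0.3, 0.5]      # a faithful translation: same direction
emb_other = [0.5, -0.3, 0.1]  # an unrelated sentence

print(cosine_similarity(emb_en, emb_id))     # ~1.0
print(cosine_similarity(emb_en, emb_other))  # near 0
```

Because cosine similarity ignores magnitude, it is robust to differences in embedding norm between the two languages.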
open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3
--- pretty_name: Evaluation run of v1olet/v1olet_merged_dpo_7B_v3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [v1olet/v1olet_merged_dpo_7B_v3](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-13T14:24:48.868397](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3/blob/main/results_2023-12-13T14-24-48.868397.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6411740347675706,\n\ \ \"acc_stderr\": 0.03228342039008203,\n \"acc_norm\": 0.6407691161331389,\n\ \ \"acc_norm_stderr\": 0.03295002376578124,\n \"mc1\": 0.5740514075887393,\n\ \ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6907171691355769,\n\ \ \"mc2_stderr\": 0.015243695704371275\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n\ \ \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989503\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n\ \ \"acc_stderr\": 0.004507768029590101,\n \"acc_norm\": 0.8770165305715992,\n\ \ \"acc_norm_stderr\": 0.0032774703870227257\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\ \ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\ \ \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\ : 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\ : {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n\ \ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n\ \ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\ \ \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n\ \ \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\"\ : {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\ \ 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\"\ : 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n\ \ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \ \ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"\ acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\ acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\ \ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\ \ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\ \ \"acc_stderr\": 0.022891687984554956,\n \"acc_norm\": 0.7967741935483871,\n\ \ \"acc_norm_stderr\": 0.022891687984554956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\ \ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\ acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\ \ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \ \ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \ \ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\ acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\ acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\ \ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n\ \ \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \ \ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\ \ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\ \ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\ \ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\ acc_norm\": 
0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\ \ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\ \ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\ \ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\ \ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n\ \ \"acc_stderr\": 0.016653875777524006,\n \"acc_norm\": 0.4547486033519553,\n\ \ \"acc_norm_stderr\": 0.016653875777524006\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 
0.026173908506718576,\n\ \ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\ \ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\ \ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\ \ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \ \ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\ \ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\ \ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n\ \ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \ \ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\ \ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\ \ 
\"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\ \ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\ \ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\ \ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6907171691355769,\n\ \ \"mc2_stderr\": 0.015243695704371275\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918742\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66868840030326,\n \ \ \"acc_stderr\": 0.01296499967968867\n }\n}\n```" repo_url: https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|arc:challenge|25_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|arc:challenge|25_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-13T14-24-48.868397.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|gsm8k|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|gsm8k|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-13T14-24-48.868397.parquet' - config_name: 
harness_hellaswag_10 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hellaswag|10_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hellaswag|10_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-16-48.443238.parquet' - 
'**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-16-48.443238.parquet' - 
'**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-16-48.443238.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet' 
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2023_12_13T14_16_48.443238
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-16-48.443238.parquet'
  - split: 2023_12_13T14_24_48.868397
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet'
  - split: latest
    path:
'**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - 
'**/details_harness|hendrycksTest-virology|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|truthfulqa:mc|0_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|truthfulqa:mc|0_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-13T14-24-48.868397.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_13T14_16_48.443238 path: - '**/details_harness|winogrande|5_2023-12-13T14-16-48.443238.parquet' - split: 2023_12_13T14_24_48.868397 path: - '**/details_harness|winogrande|5_2023-12-13T14-24-48.868397.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-13T14-24-48.868397.parquet' - config_name: results data_files: - split: 2023_12_13T14_16_48.443238 path: - results_2023-12-13T14-16-48.443238.parquet - split: 2023_12_13T14_24_48.868397 path: - results_2023-12-13T14-24-48.868397.parquet - split: latest path: - results_2023-12-13T14-24-48.868397.parquet --- # Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v3 <!-- Provide a quick summary of the dataset. 
-->

Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B_v3](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following (note that the splits are named by run timestamp, plus "latest"):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3",
    "harness_winogrande_5",
    split="latest")
```

## Latest results

These are the [latest results from run 2023-12-13T14:24:48.868397](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3/blob/main/results_2023-12-13T14-24-48.868397.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6411740347675706, "acc_stderr": 0.03228342039008203, "acc_norm": 0.6407691161331389, "acc_norm_stderr": 0.03295002376578124, "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.6907171691355769, "mc2_stderr": 0.015243695704371275 }, "harness|arc:challenge|25": { "acc": 0.7030716723549488, "acc_stderr": 0.013352025976725223, "acc_norm": 0.7261092150170648, "acc_norm_stderr": 0.013032004972989503 }, "harness|hellaswag|10": { "acc": 0.7143995220075682, "acc_stderr": 0.004507768029590101, "acc_norm": 0.8770165305715992, "acc_norm_stderr": 0.0032774703870227257 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7291666666666666, "acc_stderr": 0.03716177437566017, "acc_norm": 0.7291666666666666, "acc_norm_stderr": 0.03716177437566017 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.03643037168958548, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.02516798233389414, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.02516798233389414 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.04451807959055328, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.04451807959055328 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.022891687984554956, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.022891687984554956 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 
0.03514528562175008, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586808, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586808 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919436, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6410256410256411, "acc_stderr": 0.02432173848460235, "acc_norm": 0.6410256410256411, "acc_norm_stderr": 0.02432173848460235 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976037, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976037 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.02704462171947408, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.02704462171947408 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467618, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467618 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.0306365913486998, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.0306365913486998 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.037683359597287434, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.037683359597287434 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.03462419931615623, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 
0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.024257901705323378, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.024257901705323378 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4547486033519553, "acc_stderr": 0.016653875777524006, "acc_norm": 0.4547486033519553, "acc_norm_stderr": 0.016653875777524006 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.026173908506718576, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.026173908506718576 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.026082700695399662, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.026082700695399662 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873862, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873862 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869647, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.0286619962023353, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.0286619962023353 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6552287581699346, "acc_stderr": 0.019228322018696644, "acc_norm": 0.6552287581699346, "acc_norm_stderr": 0.019228322018696644 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302505, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302505 }, 
"harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.025196929874827075, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.025196929874827075 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896309, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896309 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.6907171691355769, "mc2_stderr": 0.015243695704371275 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918742 }, "harness|gsm8k|5": { "acc": 0.66868840030326, "acc_stderr": 0.01296499967968867 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
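The headline leaderboard score for this model can be recomputed from the per-task scores in the "Latest results" JSON above. A minimal sketch, assuming the leaderboard average is the unweighted mean of the six headline benchmark scores (acc_norm for ARC and HellaSwag, acc for MMLU, Winogrande, and GSM8K, mc2 for TruthfulQA), per the Open LLM Leaderboard convention at the time of this run:

```python
# Headline scores copied from the "Latest results" JSON above.
scores = {
    "arc_challenge_acc_norm": 0.7261092150170648,   # harness|arc:challenge|25
    "hellaswag_acc_norm": 0.8770165305715992,       # harness|hellaswag|10
    "mmlu_acc": 0.6411740347675706,                 # "all" acc over hendrycksTest tasks
    "truthfulqa_mc2": 0.6907171691355769,           # harness|truthfulqa:mc|0
    "winogrande_acc": 0.8232044198895028,           # harness|winogrande|5
    "gsm8k_acc": 0.66868840030326,                  # harness|gsm8k|5
}

# Unweighted mean of the six benchmarks.
average = sum(scores.values()) / len(scores)
print(round(average, 4))
```

This reproduces the single "Average" figure the leaderboard displays for a model; the per-task standard errors in the JSON are not used in that aggregate.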
polinaeterna/push
---
dataset_info:
- config_name: custom
  features:
  - name: x
    dtype: int64
  - name: y
    dtype: int64
  splits:
  - name: train
    num_bytes: 3200
    num_examples: 200
  - name: test
    num_bytes: 4800
    num_examples: 300
  download_size: 7682
  dataset_size: 8000
- config_name: default
  features:
  - name: x
    dtype: int64
  - name: y
    dtype: int64
  splits:
  - name: train
    num_bytes: 1600
    num_examples: 100
  - name: test
    num_bytes: 3200
    num_examples: 200
  download_size: 5578
  dataset_size: 4800
configs_kwargs:
- config_name: custom
  data_dir: custom
- config_name: default
  data_dir: ./
---
# Dataset Card for "push"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
andidu/paraphrase-ru-reviews
---
language:
- ru
pretty_name: andidu/paraphrase-ru-reviews
size_categories:
- 100K<n<1M
---