| datasetId | card |
|---|---|
AhmedBou/Arabic_Quotes | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
language:
- ar
size_categories:
- 1K<n<10K
---
# Arabic Quotes Dataset

## Overview
The **Arabic Quotes Dataset** is an open-source collection of 5900+ quotes in the Arabic language, accompanied by up to three tags for each quote.
The dataset is suitable for various Natural Language Processing (NLP) tasks, such as text classification and tagging.
## Data Description
- Contains 5900+ quotes with up to three associated tags per quote.
- All quotes and tags are in Arabic.
## Use Cases
- Text Classification: Classify quotes into predefined categories.
- Tagging: Assign relevant labels or themes to quotes.
- Sentiment Analysis: Analyze sentiment expressed in quotes.
- Language Modeling: Train models to generate Arabic quotes.
- Information Retrieval: Retrieve quotes relevant to specific topics.
## License
The "Arabic Quotes" dataset is distributed under the Apache License 2.0. You are free to use it for any purpose, provided you give appropriate credit to the original source.
**Github Repository:** https://github.com/BoulahiaAhmed/Arabic-Quotes-Dataset
## Data Format
The dataset is available in CSV format. Each row represents a quote with its associated tags. Example structure:
```
quote,tags
"أنا لا أبالي برأي الناس، أنا لست عبدًا لتقييماتهم.","[حرية, تحفيز, قوة]"
"الصمت هو أكبر إجابة.","[سكوت, حكمة]"
...
```
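Given the CSV layout above, the bracketed `tags` field can be parsed into a Python list with a small helper. This is a minimal sketch assuming the tag format shown in the example; the in-memory sample below stands in for the real file:

```python
import csv
import io

def parse_tags(raw: str) -> list[str]:
    # Tags are stored as one bracketed, comma-separated string per row.
    return [t.strip() for t in raw.strip("[] ").split(",") if t.strip()]

# Minimal in-memory sample mirroring the structure shown above.
sample = 'quote,tags\n"الصمت هو أكبر إجابة.","[سكوت, حكمة]"\n'
rows = list(csv.DictReader(io.StringIO(sample)))
tags = parse_tags(rows[0]["tags"])  # ['سكوت', 'حكمة']
```

The same helper works row by row when reading the full CSV from disk.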
--- |
plbarrios/dataset | ---
license: mit
---
|
alasdevcenter/azspeech | ---
Dataset Name: azspeech
Creator: Alas Development Center
Version: 1.0
Size: 1000+ hours, 400k+ voice files
Language: Azerbaijani
---
AzSpeech is a comprehensive voice dataset curated by the Alas Development Center, consisting of over 1000 hours of diverse voice recordings, totaling more than 400,000 individual voice files. This extensive collection has been meticulously compiled from various sources across the internet, ensuring a broad representation of linguistic nuances.
The dataset aims to facilitate advancements in voice recognition technology, natural language processing, and machine learning research, offering a rich resource for developers, researchers, and organizations working in these fields.
### Availability
Out of the extensive AzSpeech collection, 4k samples from the 400k available have been made accessible for review purposes. This initiative aims to provide a glimpse into the quality and diversity of the dataset, supporting the community's engagement with our voice data. Interested parties are encouraged to contact the Alas Development Center for access to the dataset and further collaboration.
### Usage and Acknowledgments
__Commercial Use:__
Organizations interested in utilizing the AzSpeech dataset for commercial purposes are encouraged to get in touch with us. We offer access to the complete dataset on a paid basis. This approach enables organizations to explore the full extent of our dataset, tailored to meet the diverse needs of voice recognition technology, natural language processing, and machine learning applications.
__Academic and Research Use:__
Approximately 40% of the AzSpeech dataset (~400 hours) is designated for open-source use, aimed at supporting academic and research endeavors. Educational institutions wishing to access this portion of the dataset are required to form a partnership with the Alas Development Center. It is important to note that we will not be processing individual requests. Instead, our focus is on establishing collaborations with organizations that share our commitment to ethical data use. Organizations accessing the open-source data must fully comprehend and agree to our guidelines on data misuse prevention and adhere to our monitoring policy. This ensures the dataset's responsible use and aligns with our goals of advancing the field of voice technology research and development.
For educational institutions and research organizations interested in accessing the open-source portion of the AzSpeech dataset, please fill out the [following](https://forms.gle/xR11bACKfiERVAti7) form using your official company or institutional email. This process is designed to ensure that access is granted to legitimate academic and research entities committed to ethical and responsible use of the dataset.
### Collection and Pre-processing
In the collection process for the AzSpeech dataset, all voice recordings have been sourced exclusively from public domains. Throughout this meticulous process, the Alas Development Center has adhered to international laws and regulations concerning data privacy, intellectual property rights, and ethical use of digital content. This adherence ensures that the dataset complies with global standards, safeguarding the interests of individuals and entities involved while fostering innovation and research in voice technology.
Recognizing the importance of data quality for effective model training and research, we have undertaken a comprehensive preprocessing and denoising procedure to ensure the dataset provides ready data for users. This means the data is ready for immediate use in a range of applications, from fine-tuning text-to-speech and automatic speech recognition models to academic research.
__Quality Assurance:__ Each voice file has undergone rigorous quality checks to ensure clarity and usability. This includes verifying the audio quality and ensuring the spoken content matches the associated transcription.

__Denoising:__ Using advanced audio processing techniques, background noise has been significantly reduced in each recording. This denoising process enhances the purity of the voice data, making it more effective for training models that can distinguish nuanced vocal features.

__Normalization:__ Audio files have been normalized to maintain consistent volume levels across the dataset. This standardization is crucial for avoiding bias towards louder or quieter recordings during model training.
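The normalization step described above can be illustrated with a simple peak-normalization sketch. This is a simplified illustration of the general technique, not the Center's actual pipeline, and operates on a plain list of float samples:

```python
def peak_normalize(samples, target_peak=0.95):
    """Scale a waveform so its maximum absolute amplitude equals target_peak."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)  # silent clip: nothing to scale
    scale = target_peak / peak
    return [s * scale for s in samples]

# A quiet clip (peak amplitude 0.2) is boosted so its loudest sample sits near 0.95.
quiet = [0.1, -0.2, 0.05, 0.0]
loud = peak_normalize(quiet)
```

Applying the same target peak to every file is what keeps volume levels consistent across the dataset.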
### Contact Information
For access to the AzSpeech dataset, partnership inquiries, or any other questions, please contact the [Alas Development Center](https://alasdevcenter.com/contact) or write to us on [LinkedIn](https://www.linkedin.com/company/alas-development-center). |
irds/beir_climate-fever | ---
pretty_name: '`beir/climate-fever`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/climate-fever`
The `beir/climate-fever` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/climate-fever).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=5,416,593
- `queries` (i.e., topics); count=1,535
- `qrels` (relevance assessments); count=4,681
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_climate-fever', 'docs')
for record in docs:
    record  # {'doc_id': ..., 'text': ..., 'title': ...}

queries = load_dataset('irds/beir_climate-fever', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/beir_climate-fever', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
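For retrieval evaluation, the flat qrels records are typically regrouped into a per-query lookup. A minimal sketch, assuming only the record layout shown in the usage example above:

```python
from collections import defaultdict

def build_qrels_index(qrels_records):
    """Group flat qrels records into a query_id -> {doc_id: relevance} lookup."""
    index = defaultdict(dict)
    for rec in qrels_records:
        index[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(index)

# Hypothetical records following the field layout shown above.
sample = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
]
index = build_qrels_index(sample)  # {'q1': {'d1': 1, 'd2': 0}}
```

Such an index makes it cheap to score a ranked list of documents against the assessments for each query.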
## Citation Information
```
@article{Diggelmann2020CLIMATEFEVERAD,
title={CLIMATE-FEVER: A Dataset for Verification of Real-World Climate Claims},
author={T. Diggelmann and Jordan L. Boyd-Graber and Jannis Bulian and Massimiliano Ciaramita and Markus Leippold},
journal={ArXiv},
year={2020},
volume={abs/2012.00614}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch | ---
pretty_name: Evaluation run of TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T09:33:15.555085](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-22T09-33-15.555085.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34186241610738255,\n\
\ \"em_stderr\": 0.004857621548300327,\n \"f1\": 0.41979970637584074,\n\
\ \"f1_stderr\": 0.0046695180305142536,\n \"acc\": 0.3822126733737321,\n\
\ \"acc_stderr\": 0.007348726082467704\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.34186241610738255,\n \"em_stderr\": 0.004857621548300327,\n\
\ \"f1\": 0.41979970637584074,\n \"f1_stderr\": 0.0046695180305142536\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.002615326510775672\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T09_33_15.555085
path:
- '**/details_harness|drop|3_2023-10-22T09-33-15.555085.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T09-33-15.555085.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T09_33_15.555085
path:
- '**/details_harness|gsm8k|5_2023-10-22T09-33-15.555085.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T09-33-15.555085.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T09_33_15.555085
path:
- '**/details_harness|winogrande|5_2023-10-22T09-33-15.555085.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T09-33-15.555085.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- results_2023-08-28T22:45:44.482040.parquet
- split: 2023_10_22T09_33_15.555085
path:
- results_2023-10-22T09-33-15.555085.parquet
- split: latest
path:
- results_2023-10-22T09-33-15.555085.parquet
---
# Dataset Card for Evaluation run of TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T09:33:15.555085](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-22T09-33-15.555085.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"em": 0.34186241610738255,
"em_stderr": 0.004857621548300327,
"f1": 0.41979970637584074,
"f1_stderr": 0.0046695180305142536,
"acc": 0.3822126733737321,
"acc_stderr": 0.007348726082467704
},
"harness|drop|3": {
"em": 0.34186241610738255,
"em_stderr": 0.004857621548300327,
"f1": 0.41979970637584074,
"f1_stderr": 0.0046695180305142536
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775672
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
}
}
```
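A quick way to sanity-check such a results file is to recompute the `"all"` block, which in this file is the per-metric average over the tasks that report that metric. A minimal sketch with the numbers copied from the JSON above (the `aggregate` helper is illustrative, not part of the leaderboard tooling):

```python
# Per-task results copied from the JSON above (stderr fields omitted).
results = {
    "harness|drop|3": {"em": 0.34186241610738255, "f1": 0.41979970637584074},
    "harness|gsm8k|5": {"acc": 0.009097801364670205},
    "harness|winogrande|5": {"acc": 0.755327545382794},
}

def aggregate(results: dict) -> dict:
    """Average each metric over the tasks that report it."""
    sums, counts = {}, {}
    for task_metrics in results.values():
        for metric, value in task_metrics.items():
            sums[metric] = sums.get(metric, 0.0) + value
            counts[metric] = counts.get(metric, 0) + 1
    return {metric: sums[metric] / counts[metric] for metric in sums}

agg = aggregate(results)
# agg["acc"] is the mean of the gsm8k and winogrande accuracies and should
# match the "acc" value in the "all" block above (up to float rounding).
print(agg)
```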
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
scherrmann/financial_phrasebank_75agree_german | ---
language:
- de
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
task_categories:
- text-classification
task_ids:
- multi-class-classification
- sentiment-classification
pretty_name: FinancialPhrasebankGerman
tags:
- finance
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 422345
num_examples: 2763
- name: validation
num_bytes: 51710
num_examples: 344
- name: test
num_bytes: 55109
num_examples: 346
download_size: 318382
dataset_size: 529164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for German financial_phrasebank
## Dataset Description
### Dataset Summary
This dataset is a German translation of the financial phrasebank of [Malo et al. (2013)](https://arxiv.org/abs/1307.5336), restricted to sentences with a minimum agreement rate between annotators of 75% (3453 observations in total). The translation was produced automatically with [DeepL](https://www.deepl.com/translator).
### Supported Tasks and Leaderboards
Sentiment Classification
### Languages
German
## Dataset Structure
### Data Instances
```
{ "sentence": "Die finnische nationale Fluggesellschaft gab an, dass der Nettoverlust in den Monaten April bis Juni 26 Millionen Euro betrug, verglichen mit einem Nettogewinn von 13 Millionen Euro im Vorjahr..",
"label": "negative"
}
```
### Data Fields
- sentence: a tokenized line from the dataset
- label: a label corresponding to the class as a string: 'positive', 'negative' or 'neutral'
### Data Splits
The data is split into train, validation, and test sets using stratified sampling:
- train (2763 observations)
- validation (344 observations)
- test (346 observations)
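These counts tally with the 3453-sentence total stated above and correspond to an approximately 80/10/10 stratified split; a quick check in plain Python (split names and sizes taken from the list above):

```python
# Split sizes as listed above
splits = {"train": 2763, "validation": 344, "test": 346}

total = sum(splits.values())
assert total == 3453  # matches the dataset total reported above

# Share of each split in percent, rounded to one decimal place
shares = {name: round(100 * n / total, 1) for name, n in splits.items()}
print(shares)  # {'train': 80.0, 'validation': 10.0, 'test': 10.0}
```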
## Further Information
For further information regarding the source data or the annotation process, please look at the original [paper](https://arxiv.org/abs/1307.5336) or the original [dataset](https://huggingface.co/datasets/financial_phrasebank).
## Licensing Information
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/.
In particular, this license permits the free use of the data for non-commercial purposes.
If you are interested in commercial use of the data, please contact the authors of the original dataset for an appropriate license:
- [Pekka Malo](mailto:pekka.malo@aalto.fi)
- [Ankur Sinha](mailto:ankur.sinha@aalto.fi)
|
CyberHarem/jeanne_d_arc_alter_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jeanne_d_arc_alter/ジャンヌ・ダルク〔オルタ〕/贞德〔Alter〕 (Fate/Grand Order)
This is the dataset of jeanne_d_arc_alter/ジャンヌ・ダルク〔オルタ〕/贞德〔Alter〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `yellow_eyes, breasts, large_breasts, ahoge, white_hair, long_hair, very_long_hair, grey_hair, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 859.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_alter_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 759.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_alter_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1300 | 1.40 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_alter_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jeanne_d_arc_alter_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, armored_dress, black_dress, black_thighhighs, cape, fur_trim, gauntlets, holding_sword, looking_at_viewer, solo, headpiece, standing, smile |
| 1 | 8 |  |  |  |  |  | 1girl, armored_dress, black_dress, black_thighhighs, flag, gauntlets, headpiece, looking_at_viewer, solo, chain, holding_sword, fur-trimmed_cape, banner, black_cape, fur_collar, grin |
| 2 | 6 |  |  |  |  |  | 1girl, armored_dress, black_dress, black_thighhighs, gauntlets, looking_at_viewer, smile, solo, cape, chain, fur_trim, headpiece, sitting |
| 3 | 13 |  |  |  |  |  | 1girl, armored_dress, bare_shoulders, fur_trim, looking_at_viewer, solo, black_gloves, black_thighhighs, chain, clothing_cutout, headpiece, cleavage, elbow_gloves, holding_sword, smile, black_dress, flag, gauntlets, armored_boots, medium_breasts |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, elbow_gloves, looking_at_viewer, official_alternate_costume, solo, cleavage, hair_flower, purple_dress, strapless_dress, black_thighhighs, neck_ribbon, red_ribbon, smile, blush, collarbone, holding, petals, ribbon_choker, rose, sitting, black_dress, closed_mouth, drinking_glass, hair_between_eyes, open_mouth, purple_flower |
| 5 | 14 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, official_alternate_costume, short_dress, solo, jacket, long_sleeves, fur-trimmed_sleeves, smile, open_coat, fur-trimmed_coat, flag, holding_sword, cowboy_shot, fire, hair_between_eyes, knee_boots, medium_breasts, open_mouth, simple_background |
| 6 | 12 |  |  |  |  |  | 1girl, black_dress, official_alternate_costume, solo, jacket, looking_at_viewer, necklace, collarbone, fur-trimmed_coat, short_dress, long_sleeves, open_coat, blush, cleavage, hair_between_eyes, blue_coat, closed_mouth, smile, thighs |
| 7 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, black_panties, choker, cleavage, collarbone, lingerie, looking_at_viewer, navel, solo, black_thighhighs, garter_belt, hair_between_eyes, babydoll, blush, cosplay, necklace, see-through, thighs, closed_mouth, simple_background, smile, stomach |
| 8 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, navel, solo, underwear_only, blush, hair_between_eyes, lingerie, stomach, black_panties, lace-trimmed_bra, thighs, tsurime, black_bra, closed_mouth, cowboy_shot, lace-trimmed_panties |
| 9 | 6 |  |  |  |  |  | 1girl, black_bikini, black_gloves, black_jacket, choker, cleavage, hair_between_eyes, holding_sword, katana, looking_at_viewer, navel, o-ring_bikini, shrug_(clothing), solo, cropped_jacket, long_sleeves, thigh_strap, cowboy_shot, mouth_hold, o-ring_bottom, red_thighhighs, simple_background, single_thighhigh, unsheathed, white_background |
| 10 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_bikini, blush, cleavage, looking_at_viewer, official_alternate_costume, solo, thighs, necklace, bracelet, choker, sarong, twin_braids, collarbone, hair_between_eyes, navel, beach, closed_mouth, o-ring, outdoors, sitting, smile, water |
| 11 | 9 |  |  |  |  |  | 1girl, long_sleeves, solo, blush, looking_at_viewer, collared_shirt, pleated_skirt, white_shirt, black_skirt, open_jacket, closed_mouth, school_uniform, black_jacket, collarbone, red_necktie, smile, thighs |
| 12 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_leotard, cleavage, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, blush, strapless_leotard, covered_navel, detached_collar, highleg_leotard, fishnet_pantyhose, simple_background, thighs, white_background, wrist_cuffs, hair_between_eyes |
| 13 | 10 |  |  |  |  |  | 1girl, solo, hair_flower, looking_at_viewer, black_kimono, floral_print, holding, obi, wide_sleeves, blush, closed_mouth, fur_collar, long_sleeves, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armored_dress | black_dress | black_thighhighs | cape | fur_trim | gauntlets | holding_sword | looking_at_viewer | solo | headpiece | standing | smile | flag | chain | fur-trimmed_cape | banner | black_cape | fur_collar | grin | sitting | bare_shoulders | black_gloves | clothing_cutout | cleavage | elbow_gloves | armored_boots | medium_breasts | official_alternate_costume | hair_flower | purple_dress | strapless_dress | neck_ribbon | red_ribbon | blush | collarbone | holding | petals | ribbon_choker | rose | closed_mouth | drinking_glass | hair_between_eyes | open_mouth | purple_flower | short_dress | jacket | long_sleeves | fur-trimmed_sleeves | open_coat | fur-trimmed_coat | cowboy_shot | fire | knee_boots | simple_background | necklace | blue_coat | thighs | black_panties | choker | lingerie | navel | garter_belt | babydoll | cosplay | see-through | stomach | underwear_only | lace-trimmed_bra | tsurime | black_bra | lace-trimmed_panties | black_bikini | black_jacket | katana | o-ring_bikini | shrug_(clothing) | cropped_jacket | thigh_strap | mouth_hold | o-ring_bottom | red_thighhighs | single_thighhigh | unsheathed | white_background | bracelet | sarong | twin_braids | beach | o-ring | outdoors | water | collared_shirt | pleated_skirt | white_shirt | black_skirt | open_jacket | school_uniform | red_necktie | black_leotard | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | covered_navel | detached_collar | highleg_leotard | fishnet_pantyhose | wrist_cuffs | black_kimono | floral_print | obi | wide_sleeves | upper_body |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:----------------|:--------------|:-------------------|:-------|:-----------|:------------|:----------------|:--------------------|:-------|:------------|:-----------|:--------|:-------|:--------|:-------------------|:---------|:-------------|:-------------|:-------|:----------|:-----------------|:---------------|:------------------|:-----------|:---------------|:----------------|:-----------------|:-----------------------------|:--------------|:---------------|:------------------|:--------------|:-------------|:--------|:-------------|:----------|:---------|:----------------|:-------|:---------------|:-----------------|:--------------------|:-------------|:----------------|:--------------|:---------|:---------------|:----------------------|:------------|:-------------------|:--------------|:-------|:-------------|:--------------------|:-----------|:------------|:---------|:----------------|:---------|:-----------|:--------|:--------------|:-----------|:----------|:--------------|:----------|:-----------------|:-------------------|:----------|:------------|:-----------------------|:---------------|:---------------|:---------|:----------------|:-------------------|:-----------------|:--------------|:-------------|:----------------|:-----------------|:-------------------|:-------------|:-------------------|:-----------|:---------|:--------------|:--------|:---------|:-----------|:--------|:-----------------|:----------------|:--------------|:--------------|:--------------|:-----------------|:--------------|:----------------|:-------------------|:----------------|:--------------|:--------------------|:----------------|:------------------|:------------------|:--------------------|:--------------|:---------------|:---------------|:------|:---------------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | X | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | | X | X | X | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | | | | | X | X | | | X | | | | | | | | X | X | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | | X | | | | | X | X | X | | | X | X | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | | X | | | | | | X | X | | | X | | | | | | | | | | | | X | | | | X | | | | | | X | X | | | | | X | | X | | | X | X | X | | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | X | | | | | X | X | | | X | | | | | | | | | X | X | | X | | | | | | | | | | X | X | | | | | X | | X | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | | | | | X | X | | | | | | | | | | | | X | | | X | | | | | | | | | | X | X | | | | | X | | X | | | | | | | | | X | | | | | | X | X | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | | | | X | X | X | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | X | | | | X | | | X | | | | | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | | | | | | | | X | X | | | X | | | | | | | | X | X | | | X | | | | X | | | | | | X | X | | | | | X | | X | | | | | | | | | | | | | X | | X | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 9 |  |  |  |  |  | X | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | | | | | | | | X | X | | | | | | | | | | | | X | | | X | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | |
| 13 | 10 |  |  |  |  |  | X | | | | | | | | X | X | | | | | | | | | X | | | | | | | | | | | X | | | | | X | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
|
pszemraj/HC3-textgen-qa | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- chatgpt
- conversation
source_datasets: Hello-SimpleAI/HC3
pretty_name: HC3 for QA textgen
---
# HC3-textgen-qa
- the `Hello-SimpleAI/HC3` dataset reformatted for text generation
- uses special tokens for question/answer; see the dataset preview |
open-llm-leaderboard/details_maywell__kiqu-70b | ---
pretty_name: Evaluation run of maywell/kiqu-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/kiqu-70b](https://huggingface.co/maywell/kiqu-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__kiqu-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T20:08:41.581583](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__kiqu-70b/blob/main/results_2024-02-18T20-08-41.581583.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7477501453776264,\n\
\ \"acc_stderr\": 0.02896486498215866,\n \"acc_norm\": 0.7510302775108164,\n\
\ \"acc_norm_stderr\": 0.029526287642327367,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6348399492810686,\n\
\ \"mc2_stderr\": 0.014841108878766373\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360134\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6943835889265086,\n\
\ \"acc_stderr\": 0.004597265399568739,\n \"acc_norm\": 0.8794064927305317,\n\
\ \"acc_norm_stderr\": 0.0032498873947065057\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632726,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062228,\n\
\ \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062228\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n\
\ \"acc_stderr\": 0.026983346503309347,\n \"acc_norm\": 0.8819444444444444,\n\
\ \"acc_norm_stderr\": 0.026983346503309347\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7446808510638298,\n \"acc_stderr\": 0.028504856470514258,\n\
\ \"acc_norm\": 0.7446808510638298,\n \"acc_norm_stderr\": 0.028504856470514258\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.025680564640056882,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.025680564640056882\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8741935483870967,\n\
\ \"acc_stderr\": 0.018865834288030008,\n \"acc_norm\": 0.8741935483870967,\n\
\ \"acc_norm_stderr\": 0.018865834288030008\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360756,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360756\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7769230769230769,\n \"acc_stderr\": 0.02110773012724401,\n \
\ \"acc_norm\": 0.7769230769230769,\n \"acc_norm_stderr\": 0.02110773012724401\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281733,\n \"\
acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"\
acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7129629629629629,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473332,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473332\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407387,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407387\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8952745849297573,\n\
\ \"acc_stderr\": 0.010949664098633358,\n \"acc_norm\": 0.8952745849297573,\n\
\ \"acc_norm_stderr\": 0.010949664098633358\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575263,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575263\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7139664804469273,\n\
\ \"acc_stderr\": 0.015113972129062136,\n \"acc_norm\": 0.7139664804469273,\n\
\ \"acc_norm_stderr\": 0.015113972129062136\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.021670058885510803,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.021670058885510803\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062065,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062065\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.589960886571056,\n\
\ \"acc_stderr\": 0.012561837621962026,\n \"acc_norm\": 0.589960886571056,\n\
\ \"acc_norm_stderr\": 0.012561837621962026\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8137254901960784,\n \"acc_stderr\": 0.01575052628436336,\n \
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.01575052628436336\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6348399492810686,\n\
\ \"mc2_stderr\": 0.014841108878766373\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571746\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \
\ \"acc_stderr\": 0.012799353675801825\n }\n}\n```"
repo_url: https://huggingface.co/maywell/kiqu-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|arc:challenge|25_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|gsm8k|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hellaswag|10_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-08-41.581583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T20-08-41.581583.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- '**/details_harness|winogrande|5_2024-02-18T20-08-41.581583.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T20-08-41.581583.parquet'
- config_name: results
data_files:
- split: 2024_02_18T20_08_41.581583
path:
- results_2024-02-18T20-08-41.581583.parquet
- split: latest
path:
- results_2024-02-18T20-08-41.581583.parquet
---
# Dataset Card for Evaluation run of maywell/kiqu-70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/kiqu-70b](https://huggingface.co/maywell/kiqu-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__kiqu-70b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T20:08:41.581583](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__kiqu-70b/blob/main/results_2024-02-18T20-08-41.581583.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7477501453776264,
"acc_stderr": 0.02896486498215866,
"acc_norm": 0.7510302775108164,
"acc_norm_stderr": 0.029526287642327367,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6348399492810686,
"mc2_stderr": 0.014841108878766373
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.01368814730972912,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.01310678488360134
},
"harness|hellaswag|10": {
"acc": 0.6943835889265086,
"acc_stderr": 0.004597265399568739,
"acc_norm": 0.8794064927305317,
"acc_norm_stderr": 0.0032498873947065057
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062228,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062228
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8819444444444444,
"acc_stderr": 0.026983346503309347,
"acc_norm": 0.8819444444444444,
"acc_norm_stderr": 0.026983346503309347
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7446808510638298,
"acc_stderr": 0.028504856470514258,
"acc_norm": 0.7446808510638298,
"acc_norm_stderr": 0.028504856470514258
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8741935483870967,
"acc_stderr": 0.018865834288030008,
"acc_norm": 0.8741935483870967,
"acc_norm_stderr": 0.018865834288030008
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360756,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360756
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7769230769230769,
"acc_stderr": 0.02110773012724401,
"acc_norm": 0.7769230769230769,
"acc_norm_stderr": 0.02110773012724401
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281733,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016569,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016569
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407387,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407387
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8952745849297573,
"acc_stderr": 0.010949664098633358,
"acc_norm": 0.8952745849297573,
"acc_norm_stderr": 0.010949664098633358
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575263,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575263
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7139664804469273,
"acc_stderr": 0.015113972129062136,
"acc_norm": 0.7139664804469273,
"acc_norm_stderr": 0.015113972129062136
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.021670058885510803,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.021670058885510803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062065,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062065
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.589960886571056,
"acc_stderr": 0.012561837621962026,
"acc_norm": 0.589960886571056,
"acc_norm_stderr": 0.012561837621962026
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.01575052628436336,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.01575052628436336
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6348399492810686,
"mc2_stderr": 0.014841108878766373
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571746
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801825
}
}
```
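As an illustrative sketch (not part of the evaluation harness itself), the per-task `acc` values in a results file like the one above can be averaged to reproduce an aggregate score; the excerpt below is a hypothetical subset of the tasks, chosen only for illustration:

```python
import statistics

# Hypothetical excerpt of per-task results, mirroring the structure above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.41},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8355263157894737},
}

# Mean accuracy over the selected tasks (the leaderboard aggregates similarly
# over all tasks; this subset will not match the "all" value above).
mean_acc = statistics.mean(task["acc"] for task in results.values())
print(round(mean_acc, 4))
```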
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ASCCCCCCCC/amazon_zh_simple | ---
license: apache-2.0
---
|
zicsx/C4-Hindi-Cleaned | ---
language:
- hi
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19615771517.59057
num_examples: 6611315
download_size: 15187583565
dataset_size: 19615771517.59057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "C4-Hindi-Cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
averageandyyy/part_1_imda_100k | ---
dataset_info:
features:
- name: transcript
dtype: string
- name: path
dtype: string
- name: waveform
sequence: float64
splits:
- name: train
num_bytes: 67028014130.12406
num_examples: 100000
download_size: 16182561860
dataset_size: 67028014130.12406
---
# Dataset Card for "part_1_imda_100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hakurei/open-instruct-v1 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 100K<n<1M
---
# Open Instruct V1 - A dataset for having LLMs follow instructions.
Open Instruct V1 is an amalgamation of different datasets that are cleaned and then collated into a single format for training.
## Dataset Breakdown
| Dataset | Amount of Samples |
|----------------|-------------------|
| [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) | 51759 |
| [Self Instruct](https://github.com/yizhongw/self-instruct) | 82599 |
| [GPT-4 Instruct](https://github.com/teknium1/GPTeacher) | 18194 |
| [Code Alpaca](https://huggingface.co/datasets/HuggingFaceH4/CodeAlpaca_20K) | 18019 |
| [Dolly](https://huggingface.co/datasets/HuggingFaceH4/databricks_dolly_15k) | 15015 |
| [Synthetic](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise) | 33143 |
| [Roleplay](https://github.com/teknium1/GPTeacher) | 3146 |
| [asss](https://huggingface.co/datasets/HuggingFaceH4/asss) | 448 |
| [instruction-dataset](https://huggingface.co/datasets/HuggingFaceH4/instruction-dataset) | 327 |
| Total | 222650 |
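The collation step described above can be sketched roughly as follows; the field names (`instruction`, `input`, `output`) and the per-source schemas are assumptions for illustration, not the dataset's documented schema:

```python
# Hypothetical records from two sources with differing schemas.
alpaca_style = [{"instruction": "Add 2 and 2.", "input": "", "output": "4"}]
qa_style = [{"question": "Capital of France?", "answer": "Paris"}]

def to_unified(record):
    """Map a record from either assumed schema into one shared format."""
    if "instruction" in record:
        return {"instruction": record["instruction"],
                "input": record.get("input", ""),
                "output": record["output"]}
    return {"instruction": record["question"],
            "input": "",
            "output": record["answer"]}

collated = [to_unified(r) for r in alpaca_style + qa_style]
```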
|
Maheswari001/finetuned-indian-food | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1429041721.5834336
num_examples: 5328
- name: test
num_bytes: 282780411.3925666
num_examples: 941
download_size: 1601613602
dataset_size: 1711822132.9760003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
antoinelouis/msmarco-dev-small-negatives | ---
license: mit
size_categories: 1K<n<10K
task_categories:
- text-retrieval
task_ids:
- document-retrieval
tags:
- MS MARCO
configs:
- config_name: negatives
data_files:
- split: colbertv2
path: colbertv2-negatives.dev.small.jsonl
---
[Under Construction]
This repository contains up to 1,000 hard negatives from several retrieval systems for the 6,980 queries of the MS MARCO small dev set. This data can be used to evaluate the performance of reranking models for second-stage retrieval, given a set of 1,000 candidate passages (including the positive ones).
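As a sketch of the intended use — evaluating a reranker over candidate passages — MRR@10 over reranked lists might be computed as below; the data layout here is a toy illustration, not the repository's actual JSONL schema:

```python
def mrr_at_10(ranked_lists, positives):
    """Mean reciprocal rank at cutoff 10.

    ranked_lists: {query_id: [passage_id, ...]} after reranking.
    positives:    {query_id: set of relevant passage_ids}.
    """
    total = 0.0
    for qid, ranking in ranked_lists.items():
        for rank, pid in enumerate(ranking[:10], start=1):
            if pid in positives.get(qid, set()):
                total += 1.0 / rank
                break
    return total / len(ranked_lists)

# Toy example: query "q1" has its positive at rank 2, "q2" at rank 1.
ranked = {"q1": ["p9", "p3", "p7"], "q2": ["p1", "p4"]}
rels = {"q1": {"p3"}, "q2": {"p1"}}
print(mrr_at_10(ranked, rels))  # (1/2 + 1) / 2 = 0.75
```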
jaimin/Image_Caption | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4761482918.916
num_examples: 15012
download_size: 4603698692
dataset_size: 4761482918.916
---
# Dataset Card for "Image_Caption"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tomc43841/public_smash_medium_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: Start
dtype: bool
- name: A
dtype: bool
- name: B
dtype: bool
- name: X
dtype: bool
- name: Y
dtype: bool
- name: Z
dtype: bool
- name: DPadUp
dtype: bool
- name: DPadDown
dtype: bool
- name: DPadLeft
dtype: bool
- name: DPadRight
dtype: bool
- name: L
dtype: bool
- name: R
dtype: bool
- name: LPressure
dtype: int64
- name: RPressure
dtype: int64
- name: XAxis
dtype: int64
- name: YAxis
dtype: int64
- name: CXAxis
dtype: int64
- name: CYAxis
dtype: int64
splits:
- name: train
num_bytes: 4538861502.929
num_examples: 44679
download_size: 4014903867
dataset_size: 4538861502.929
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FSMBench/fsmbench_what_will_be_the_state_12K | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: fsm_id
dtype: string
- name: fsm_json
dtype: string
- name: difficulty_level
dtype: int64
- name: transition_matrix
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: substring_index
dtype: int64
- name: number_of_states
dtype: int64
- name: number_of_alphabets
dtype: int64
- name: state_alpha_combo
dtype: string
splits:
- name: validation
num_bytes: 29009393
num_examples: 12800
download_size: 1193868
dataset_size: 29009393
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
mfidabel/nicoespb-art | ---
license: creativeml-openrail-m
---
|
MayaMhemar/abiturients | ---
task_categories:
- question-answering
- text-generation
language:
- uk
tags:
- university
- graduates
size_categories:
- n<1K
--- |
irds/lotte_writing_dev | ---
pretty_name: '`lotte/writing/dev`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/writing/dev`
The `lotte/writing/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/writing/dev).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=277,072
This dataset is used by: [`lotte_writing_dev_forum`](https://huggingface.co/datasets/irds/lotte_writing_dev_forum), [`lotte_writing_dev_search`](https://huggingface.co/datasets/irds/lotte_writing_dev_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_writing_dev', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
CyberHarem/xiangling_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of xiangling/香菱/香菱 (Genshin Impact)
This is the dataset of xiangling/香菱/香菱 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `blue_hair, short_hair, braid, yellow_eyes, hair_rings, hair_ornament, hairclip, braided_hair_rings, breasts, twin_braids, thick_eyebrows`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 919.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xiangling_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 769.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xiangling_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1299 | 1.47 GiB | [Download](https://huggingface.co/datasets/CyberHarem/xiangling_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/xiangling_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | chinese_clothes, dress, open_mouth, 2girls, :d, black_gloves, fingerless_gloves, sleeveless, solo_focus, vision_(genshin_impact) |
| 1 | 8 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, looking_at_viewer, sleeveless, solo, china_dress, holding_polearm, simple_background, vision_(genshin_impact), white_background, bell, smile, sidelocks |
| 2 | 14 |  |  |  |  |  | 1girl, fingerless_gloves, looking_at_viewer, solo, bandaid_on_knee, black_gloves, china_dress, holding_polearm, vision_(genshin_impact), bare_shoulders, bell, sleeveless_dress, small_breasts, boots, simple_background, thigh_strap, thighs, open_mouth, white_background, :d, blush, closed_mouth, fire, pelvic_curtain, standing |
| 3 | 8 |  |  |  |  |  | 1girl, black_gloves, food, holding_plate, looking_at_viewer, small_breasts, solo, china_dress, open_mouth, vision_(genshin_impact), bare_shoulders, fingerless_gloves, :d, standing_on_one_leg, thigh_strap, thighs, blush, outdoors, sleeveless_dress |
| 4 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, china_dress, closed_mouth, fingerless_gloves, small_breasts, solo, looking_at_viewer, simple_background, blush, sleeveless_dress, white_background, smile, thigh_strap, thighs, vision_(genshin_impact), bell, pelvic_curtain |
| 5 | 7 |  |  |  |  |  | 1girl, alternate_costume, open_mouth, solo, blush, looking_at_viewer, simple_background, white_background, long_sleeves, white_shirt, :d, brown_skirt, school_uniform, short_sleeves |
| 6 | 10 |  |  |  |  |  | 2boys, chinese_clothes, long_sleeves, male_focus, frilled_sleeves, single_earring, tassel_earrings, holding, smile, shorts, solo_focus, 1boy, closed_mouth, food, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | chinese_clothes | dress | open_mouth | 2girls | :d | black_gloves | fingerless_gloves | sleeveless | solo_focus | vision_(genshin_impact) | 1girl | looking_at_viewer | solo | china_dress | holding_polearm | simple_background | white_background | bell | smile | sidelocks | bandaid_on_knee | bare_shoulders | sleeveless_dress | small_breasts | boots | thigh_strap | thighs | blush | closed_mouth | fire | pelvic_curtain | standing | food | holding_plate | standing_on_one_leg | outdoors | alternate_costume | long_sleeves | white_shirt | brown_skirt | school_uniform | short_sleeves | 2boys | male_focus | frilled_sleeves | single_earring | tassel_earrings | holding | shorts | 1boy | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------|:--------|:-------------|:---------|:-----|:---------------|:--------------------|:-------------|:-------------|:--------------------------|:--------|:--------------------|:-------|:--------------|:------------------|:--------------------|:-------------------|:-------|:--------|:------------|:------------------|:-----------------|:-------------------|:----------------|:--------|:--------------|:---------|:--------|:---------------|:-------|:-----------------|:-----------|:-------|:----------------|:----------------------|:-----------|:--------------------|:---------------|:--------------|:--------------|:-----------------|:----------------|:--------|:-------------|:------------------|:-----------------|:------------------|:----------|:---------|:-------|:---------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | | | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | | | X | | X | X | X | | | X | X | X | X | X | | | | | | | | X | X | X | | X | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | | | | | | X | X | | | X | X | X | X | X | | X | X | X | X | | | X | X | X | | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | | | X | | X | | | | | | X | X | X | | | X | X | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | X | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X |
|
R-Arfin/Depression | ---
language:
- en
tags:
- code
- machine learning
- Derpression
- Depression Detection
- Depression-Dataset
size_categories:
- 1M<n<10M
--- |
allenai/WildChat | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: model
dtype: string
- name: timestamp
dtype: timestamp[s, tz=UTC]
- name: conversation
list:
- name: content
dtype: string
- name: language
dtype: string
- name: redacted
dtype: bool
- name: role
dtype: string
- name: toxic
dtype: bool
- name: turn
dtype: int64
- name: language
dtype: string
- name: openai_moderation
list:
- name: categories
struct:
- name: harassment
dtype: bool
- name: harassment/threatening
dtype: bool
- name: hate
dtype: bool
- name: hate/threatening
dtype: bool
- name: self-harm
dtype: bool
- name: self-harm/instructions
dtype: bool
- name: self-harm/intent
dtype: bool
- name: sexual
dtype: bool
- name: sexual/minors
dtype: bool
- name: violence
dtype: bool
- name: violence/graphic
dtype: bool
- name: category_scores
struct:
- name: harassment
dtype: float64
- name: harassment/threatening
dtype: float64
- name: hate
dtype: float64
- name: hate/threatening
dtype: float64
- name: self-harm
dtype: float64
- name: self-harm/instructions
dtype: float64
- name: self-harm/intent
dtype: float64
- name: sexual
dtype: float64
- name: sexual/minors
dtype: float64
- name: violence
dtype: float64
- name: violence/graphic
dtype: float64
- name: flagged
dtype: bool
- name: detoxify_moderation
list:
- name: identity_attack
dtype: float32
- name: insult
dtype: float32
- name: obscene
dtype: float32
- name: severe_toxicity
dtype: float32
- name: sexual_explicit
dtype: float32
- name: threat
dtype: float32
- name: toxicity
dtype: float32
- name: toxic
dtype: bool
- name: redacted
dtype: bool
splits:
- name: train
num_bytes: 3900538458
num_examples: 652139
download_size: 2102684185
dataset_size: 3900538458
pretty_name: WildChat
extra_gated_prompt: >-
Access to this dataset is automatically granted upon accepting the [**AI2
ImpACT License - Low Risk Artifacts (“LR
Agreement”)**](https://allenai.org/licenses/impact-lr) and completing all
fields below.
extra_gated_fields:
Your full name: text
Organization or entity you are affiliated with: text
State or country you are located in: text
Contact email: text
Please describe your intended use of the low risk artifact(s): text
I AGREE to the terms and conditions of the LR Agreement above: checkbox
I AGREE to AI2’s use of my information for legal notices and administrative matters: checkbox
I CERTIFY that the information I have provided is true and accurate: checkbox
tags:
- not-for-all-audiences
- instruction-finetuning
size_categories:
- 100K<n<1M
task_categories:
- conversational
- text-generation
- question-answering
---
# Dataset Card for WildChat
## Dataset Description
- **Paper:** https://openreview.net/forum?id=Bl8u7ZRlbM
- **License:** https://allenai.org/licenses/impact-lr
- **Language(s) (NLP):** multi-lingual
- **Point of Contact:** [Yuntian Deng](mailto:yuntiand@allenai.org)
### Dataset Summary
WildChat is a collection of 650K conversations between human users and ChatGPT. We collected WildChat by offering online users free access to OpenAI's GPT-3.5 and GPT-4. The dataset contains a broad spectrum of user-chatbot interactions not previously covered by other instruction fine-tuning datasets: for example, interactions include ambiguous user requests, code-switching, topic-switching, political discussions, etc. WildChat can serve both as a dataset for instruction fine-tuning and as a valuable resource for studying user behaviors. Note that this dataset contains toxic user inputs/ChatGPT responses. A nontoxic subset of this dataset can be found [here](https://huggingface.co/datasets/allenai/WildChat-nontoxic).
WildChat has been openly released under AI2's ImpACT license as a low-risk artifact. The use of WildChat to cause harm is strictly prohibited.
### Languages
66 languages were detected in WildChat.
### Personal and Sensitive Information
The data has been de-identified with Microsoft Presidio and hand-written rules by the authors.
### Data Fields
- `conversation_id` (string): Each conversation has a unique id.
- `model` (string): The underlying OpenAI model, such as gpt-3.5-turbo or gpt-4.
- `timestamp` (timestamp): The timestamp of the last turn in the conversation in UTC.
- `conversation` (list): A list of user/assistant utterances. Each utterance is a dictionary containing the `role` of the speaker (user or assistant), the `content` of the utterance, the detected `language` of the utterance, whether the content of the utterance is considered `toxic`, and whether PII has been detected and anonymized (`redacted`).
- `turn` (int): The number of turns in the conversation. A turn refers to one round of user-assistant interaction.
- `language` (string): The language of the conversation. Note that this is the most frequently detected language in the utterances of the conversation.
- `openai_moderation` (list): A list of OpenAI Moderation results. Each element in the list corresponds to one utterance in the conversation.
- `detoxify_moderation` (list): A list of Detoxify results. Each element in the list corresponds to one utterance in the conversation.
- `toxic` (bool): Whether this conversation contains any utterances considered to be toxic by either OpenAI Moderation or Detoxify.
- `redacted` (bool): Whether this conversation contains any utterances in which PII is detected and anonymized.
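As a minimal illustration of how these fields combine in practice, the sketch below (not part of the official tooling) keeps non-toxic, non-redacted conversations and flattens them into (user, assistant) pairs; the `"English"` comparison assumes full language names as values, which should be verified against an actual record:

```python
# Illustrative filter over WildChat records. Field names follow the
# "Data Fields" list above; the "English" value is an assumption --
# verify against a real record before use.
def to_instruction_pairs(record):
    """Return (user, assistant) content pairs from one conversation,
    skipping toxic, redacted, or non-English conversations."""
    if record["toxic"] or record["redacted"] or record["language"] != "English":
        return []
    msgs = record["conversation"]
    return [
        (msgs[i]["content"], msgs[i + 1]["content"])
        for i in range(len(msgs) - 1)
        if msgs[i]["role"] == "user" and msgs[i + 1]["role"] == "assistant"
    ]
```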
### Empty User Inputs
This dataset includes a small subset of conversations where users submitted empty inputs, sometimes leading to hallucinated responses from the assistant. This issue, first noticed by @yuchenlin, arises from the design of our Huggingface chatbot used for data collection, which did not restrict the submission of empty inputs. As a result, users could submit without entering any text, causing the assistant to generate responses without any user prompts. This occurs in a small fraction of the dataset---12,405 out of 652,139 conversations.
### Licensing Information
WildChat is made available under the [**AI2
ImpACT License - Low Risk Artifacts ("LR
Agreement")**](https://allenai.org/licenses/impact-lr)
### Citation Information
Please consider citing [our paper](https://openreview.net/forum?id=Bl8u7ZRlbM) if you find this dataset useful:
```
@inproceedings{
zhao2024inthewildchat,
title={(InThe)WildChat: 570K Chat{GPT} Interaction Logs In The Wild},
author={Zhao, Wenting and Ren, Xiang and Hessel, Jack and Cardie, Claire and Choi, Yejin and Deng, Yuntian},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=Bl8u7ZRlbM}
}
``` |
dim/ru_turbo_saiga_3k | ---
license: mit
dataset_info:
features:
- name: messages
sequence:
- name: role
dtype: string
- name: content
dtype: string
- name: seed
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 6765306.998239436
num_examples: 3000
download_size: 3091422
dataset_size: 6765306.998239436
---
|
jwigginton/eps-trend-sp500 | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: current_qtr
dtype: string
- name: current_estimate_current_qtr
dtype: float64
- name: next_qtr
dtype: string
- name: current_estimate_next_qtr
dtype: float64
- name: current_year
dtype: int64
- name: current_estimate_current_year
dtype: float64
- name: next_year
dtype: int64
- name: current_estimate_next_year
dtype: float64
- name: 7_days_ago_current_qtr
dtype: float64
- name: 7_days_ago_next_qtr
dtype: float64
- name: 7_days_ago_current_year
dtype: float64
- name: 7_days_ago_next_year
dtype: float64
- name: 30_days_ago_current_qtr
dtype: float64
- name: 30_days_ago_next_qtr
dtype: float64
- name: 30_days_ago_current_year
dtype: float64
- name: 30_days_ago_next_year
dtype: float64
- name: 60_days_ago_current_qtr
dtype: float64
- name: 60_days_ago_next_qtr
dtype: float64
- name: 60_days_ago_current_year
dtype: float64
- name: 60_days_ago_next_year
dtype: float64
- name: 90_days_ago_current_qtr
dtype: float64
- name: 90_days_ago_next_qtr
dtype: float64
- name: 90_days_ago_current_year
dtype: float64
- name: 90_days_ago_next_year
dtype: float64
splits:
- name: train
num_bytes: 220508
num_examples: 997
download_size: 90843
dataset_size: 220508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
GokhanAI/OPENHERMES1M | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: prompt
dtype: string
- name: message_send
dtype: string
- name: prompt_id
dtype: int64
splits:
- name: test
num_bytes: 41014716
num_examples: 10291
- name: train
num_bytes: 4125437184
num_examples: 1018744
download_size: 2242144464
dataset_size: 4166451900
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_invariant_tag_can_or_not | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 455
num_examples: 6
- name: test
num_bytes: 378
num_examples: 6
- name: train
num_bytes: 6957
num_examples: 94
download_size: 9267
dataset_size: 7790
---
# Dataset Card for "MULTI_VALUE_cola_invariant_tag_can_or_not"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mila-intel/ProtST-EnzymeCommission | ---
configs:
- config_name: default
data_files:
- split: train
path: enzyme_commission_train.csv
- split: validation
path: enzyme_commission_valid.csv
- split: test
path: enzyme_commission_test.csv
--- |
MAli-Farooq/Derm-T2IM-Dataset | ---
license: mit
task_categories:
- text-to-image
language:
- en
tags:
- medical
- code
pretty_name: 'DERM-T2IM Skin Lesion Dataset '
size_categories:
- 1K<n<10K
---
1. The Dataset6K folder consists of two subfolders, Benign and Malignant, each containing 3k data samples.
2. The Smart transformation folder consists of three subfolders (tiny benign moles, large malignant moles, and multiple moles), each containing advanced skin lesion augmentation results.
3. To generate more data using the Derm-T2IM model, upload the model to the Stable Diffusion GUI, which can be cloned from the GitHub repo below.
Link: https://github.com/AUTOMATIC1111/stable-diffusion-webui
|
doceoSoftware/docvqa_clicars_histmant_Mireia_120_4 | ---
dataset_info:
features:
- name: image
dtype: image
- name: query
sequence: string
- name: answers
sequence: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 6762481.0
num_examples: 120
- name: test
num_bytes: 179724.0
num_examples: 4
download_size: 5488007
dataset_size: 6942205.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
unpackableorange/StyleDrop-weights | ---
license: mit
---
|
BuroIdentidadDigital/Ine_Reverso | ---
license: c-uda
---
|
SEACrowd/sentiment_nathasa_review | ---
license: unknown
tags:
- sentiment-analysis
language:
- ind
---
# sentiment_nathasa_review
Customer Review (Natasha Skincare) is a customer emotion dataset of 19,253 samples; the per-class breakdown is 804 joy, 43 surprise, 154 anger, 61 fear, 287 sad, 167 disgust, and 17,736 no-emotion.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
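Given the heavy skew toward the no-emotion class noted above, inverse-frequency class weighting is one common mitigation when training a classifier on this data. A sketch using the counts reported in this card:

```python
# Class counts as reported in this card (note the heavy imbalance).
counts = {"joy": 804, "surprise": 43, "anger": 154, "fear": 61,
          "sad": 287, "disgust": 167, "no-emotion": 17736}

def inverse_frequency_weights(class_counts):
    """Weight each class by total / (n_classes * count), so rare
    classes contribute proportionally more to the training loss."""
    total = sum(class_counts.values())
    n = len(class_counts)
    return {c: total / (n * k) for c, k in class_counts.items()}

weights = inverse_frequency_weights(counts)
# rare classes such as "surprise" receive a far larger weight
# than the dominant "no-emotion" class
```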
## Citation
```
@article{nurlaila2018classification,
title={CLASSIFICATION OF CUSTOMERS EMOTION USING NA{\"I}VE BAYES CLASSIFIER (Case Study: Natasha Skin Care)},
author={Nurlaila, Afifah and Wiranto, Wiranto and Saptono, Ristu},
journal={ITSMART: Jurnal Teknologi dan Informasi},
volume={6},
number={2},
pages={92--97},
year={2018}
}
```
## License
Unknown
## Homepage
[https://jurnal.uns.ac.id/itsmart/article/viewFile/17328/15082](https://jurnal.uns.ac.id/itsmart/article/viewFile/17328/15082)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
epptt/erukaLabels | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.json"
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
deepinv/degradations | ---
license: bsd-3-clause
---
|
WillHeld/wmt19-valid-only-ru_en | ---
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- ru
- en
splits:
- name: validation
num_bytes: 1085596
num_examples: 3000
download_size: 605574
dataset_size: 1085596
---
# Dataset Card for "wmt19-valid-only-ru_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_EleutherAI__pythia-6.7b | ---
pretty_name: Evaluation run of EleutherAI/pythia-6.7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/pythia-6.7b](https://huggingface.co/EleutherAI/pythia-6.7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-6.7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T21:18:46.645949](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.7b/blob/main/results_2023-10-21T21-18-46.645949.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.00034761798968570957,\n \"f1\": 0.04782403523489941,\n\
\ \"f1_stderr\": 0.001192823686148428,\n \"acc\": 0.3289061036768785,\n\
\ \"acc_stderr\": 0.008126220712088333\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968570957,\n\
\ \"f1\": 0.04782403523489941,\n \"f1_stderr\": 0.001192823686148428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \
\ \"acc_stderr\": 0.0028227133223877035\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788961\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/pythia-6.7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T21_18_46.645949
path:
- '**/details_harness|drop|3_2023-10-21T21-18-46.645949.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T21-18-46.645949.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T21_18_46.645949
path:
- '**/details_harness|gsm8k|5_2023-10-21T21-18-46.645949.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T21-18-46.645949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:34:10.394938.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:34:10.394938.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T21_18_46.645949
path:
- '**/details_harness|winogrande|5_2023-10-21T21-18-46.645949.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T21-18-46.645949.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_34_10.394938
path:
- results_2023-07-19T17:34:10.394938.parquet
- split: 2023_10_21T21_18_46.645949
path:
- results_2023-10-21T21-18-46.645949.parquet
- split: latest
path:
- results_2023-10-21T21-18-46.645949.parquet
---
# Dataset Card for Evaluation run of EleutherAI/pythia-6.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-6.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-6.7b](https://huggingface.co/EleutherAI/pythia-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
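Concretely, a split name is the run timestamp with `-` and `:` replaced by `_` (compare the file `…|5_2023-10-21T21-18-46.645949.parquet` with its split `2023_10_21T21_18_46.645949` in the configs above). A small helper illustrating this convention — an informal sketch, not an official API:

```python
def timestamp_to_split(ts: str) -> str:
    """Derive the split name used in this card from a run timestamp."""
    return ts.replace("-", "_").replace(":", "_")

# e.g. the second run of this dataset:
print(timestamp_to_split("2023-10-21T21:18:46.645949"))
# -> 2023_10_21T21_18_46.645949
```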
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-6.7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-21T21:18:46.645949](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.7b/blob/main/results_2023-10-21T21-18-46.645949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968570957,
"f1": 0.04782403523489941,
"f1_stderr": 0.001192823686148428,
"acc": 0.3289061036768785,
"acc_stderr": 0.008126220712088333
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968570957,
"f1": 0.04782403523489941,
"f1_stderr": 0.001192823686148428
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788961
}
}
```
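Since the results above are plain JSON, per-task metrics can be extracted with a few lines of Python. A minimal sketch using only the accuracy values shown in this card:

```python
import json

# Accuracy entries from the latest results shown above.
results = json.loads("""
{
  "harness|gsm8k|5":      {"acc": 0.01061410159211524},
  "harness|winogrande|5": {"acc": 0.6471981057616417}
}
""")

acc_by_task = {task: values["acc"] for task, values in results.items()}
best_task = max(acc_by_task, key=acc_by_task.get)
print(best_task)  # -> harness|winogrande|5
```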
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdityaSingh312/text_to_sql | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 61413399
num_examples: 7000
- name: validation
num_bytes: 7113615
num_examples: 1034
download_size: 3768637
dataset_size: 68527014
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
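The tokenized fields above (`input_ids`, `attention_mask`, `labels`) follow the common Hugging Face seq2seq convention; the sketch below uses toy token ids and a hypothetical `pad_example` helper to show how they typically relate (an assumption about this dataset's preprocessing, not a description of it):

```python
# Toy illustration of the tokenized-field convention:
# input_ids      - token ids of the "input" text, padded to a fixed length
# attention_mask - 1 for real tokens, 0 for padding
# labels         - token ids of the "output" text, with padding set to -100
#                  so it is ignored by the usual cross-entropy loss
PAD_ID = 0

def pad_example(input_ids, label_ids, max_len):
    """Pad one example to max_len, building the mask and ignore-index labels."""
    attention_mask = [1] * len(input_ids) + [0] * (max_len - len(input_ids))
    input_ids = input_ids + [PAD_ID] * (max_len - len(input_ids))
    labels = label_ids + [-100] * (max_len - len(label_ids))
    return {"input_ids": input_ids, "attention_mask": attention_mask, "labels": labels}

ex = pad_example([101, 7592, 102], [101, 2054, 102], max_len=5)
print(ex["attention_mask"])  # [1, 1, 1, 0, 0]
```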
|
CyberHarem/lutzow_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lutzow/リュッツォウ/吕佐夫 (Azur Lane)
This is the dataset of lutzow/リュッツォウ/吕佐夫 (Azur Lane), containing 68 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, hat, grey_hair, black_headwear, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 125.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 68 | 60.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 167 | 133.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 68 | 105.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 167 | 200.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutzow_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lutzow_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
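The non-raw packages (800, 1200, and the stage3 crops) are plain IMG+TXT archives: each image sits next to a same-named `.txt` file holding its comma-separated tags. A minimal sketch of pairing them after extraction, without waifuc (the directory layout is assumed from the package description above):

```python
import os

def pair_images_with_tags(dataset_dir):
    """Yield (image_path, [tags]) pairs from an extracted IMG+TXT archive."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # image without a tag file; skip it
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```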
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_skirt, drill_locks, looking_at_viewer, solo, white_shirt, red_nails, braid, detached_sleeves, nail_polish, non-humanoid_robot, smile, thighhighs, belt, drill_hair, feet_out_of_frame, grey_eyes |
| 1 | 10 |  |  |  |  |  | 1girl, black_skirt, detached_sleeves, looking_at_viewer, solo, long_sleeves, white_shirt, bare_shoulders, black_footwear, mini_hat, open_mouth, black_thighhighs, thigh_boots, braid, one_eye_closed, simple_background, sitting, grey_eyes, high-waist_skirt, medium_hair, stuffed_toy, white_background, ;o, holding, nail_polish, red_nails, yawning |
| 2 | 21 |  |  |  |  |  | looking_at_viewer, red_eyes, 1girl, black_dress, cleavage, official_alternate_costume, solo, thighhighs, white_hair, bare_shoulders, blush, tongue_out, drill_locks, hair_ornament, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | drill_locks | looking_at_viewer | solo | white_shirt | red_nails | braid | detached_sleeves | nail_polish | non-humanoid_robot | smile | thighhighs | belt | drill_hair | feet_out_of_frame | grey_eyes | long_sleeves | bare_shoulders | black_footwear | mini_hat | open_mouth | black_thighhighs | thigh_boots | one_eye_closed | simple_background | sitting | high-waist_skirt | medium_hair | stuffed_toy | white_background | ;o | holding | yawning | red_eyes | black_dress | cleavage | official_alternate_costume | white_hair | blush | tongue_out | hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------|:--------------------|:-------|:--------------|:------------|:--------|:-------------------|:--------------|:---------------------|:--------|:-------------|:-------|:-------------|:--------------------|:------------|:---------------|:-----------------|:-----------------|:-----------|:-------------|:-------------------|:--------------|:-----------------|:--------------------|:----------|:-------------------|:--------------|:--------------|:-------------------|:-----|:----------|:----------|:-----------|:--------------|:-----------|:-----------------------------|:-------------|:--------|:-------------|:----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | | X | X | X | | | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
PriceWang/MAECG | ---
license: mit
---
|
open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B | ---
pretty_name: Evaluation run of paulml/OmniBeagleSquaredMBX-v3-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [paulml/OmniBeagleSquaredMBX-v3-7B](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T23:27:24.007560](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B/blob/main/results_2024-02-09T23-27-24.007560.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6570164678793579,\n\
\ \"acc_stderr\": 0.03201361905149607,\n \"acc_norm\": 0.6564830865217572,\n\
\ \"acc_norm_stderr\": 0.03268414092379567,\n \"mc1\": 0.5850673194614443,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.7269545808306953,\n\
\ \"mc2_stderr\": 0.01465862803375696\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.7440273037542662,\n \"acc_norm_stderr\": 0.012753013241244527\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7180840470025891,\n\
\ \"acc_stderr\": 0.004490130691020433,\n \"acc_norm\": 0.8881696873132842,\n\
\ \"acc_norm_stderr\": 0.0031451347677023105\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015057,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5850673194614443,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.7269545808306953,\n\
\ \"mc2_stderr\": 0.01465862803375696\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479651\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.01271440100992365\n }\n}\n```"
repo_url: https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|arc:challenge|25_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|gsm8k|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hellaswag|10_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-27-24.007560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T23-27-24.007560.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- '**/details_harness|winogrande|5_2024-02-09T23-27-24.007560.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T23-27-24.007560.parquet'
- config_name: results
data_files:
- split: 2024_02_09T23_27_24.007560
path:
- results_2024-02-09T23-27-24.007560.parquet
- split: latest
path:
- results_2024-02-09T23-27-24.007560.parquet
---
# Dataset Card for Evaluation run of paulml/OmniBeagleSquaredMBX-v3-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [paulml/OmniBeagleSquaredMBX-v3-7B](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B",
	"harness_winogrande_5",
	split="latest")
```
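The per-task config names listed above follow a fixed pattern: `harness_hendrycksTest_<task>_5` for the 5-shot MMLU (hendrycksTest) tasks. A small helper (a sketch, not part of the `datasets` API) can build the config name for any of the tasks in the list:

```python
def mmlu_config(task: str, n_shot: int = 5) -> str:
    # Build the config name for a 5-shot MMLU (hendrycksTest) task,
    # matching the pattern used in the configuration list above.
    return f"harness_hendrycksTest_{task}_{n_shot}"

# e.g. "harness_hendrycksTest_anatomy_5"
print(mmlu_config("anatomy"))
```

The returned string can be passed as the second argument to `load_dataset` exactly like `"harness_winogrande_5"` in the example above.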
## Latest results
These are the [latest results from run 2024-02-09T23:27:24.007560](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B/blob/main/results_2024-02-09T23-27-24.007560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6570164678793579,
"acc_stderr": 0.03201361905149607,
"acc_norm": 0.6564830865217572,
"acc_norm_stderr": 0.03268414092379567,
"mc1": 0.5850673194614443,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.7269545808306953,
"mc2_stderr": 0.01465862803375696
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.7440273037542662,
"acc_norm_stderr": 0.012753013241244527
},
"harness|hellaswag|10": {
"acc": 0.7180840470025891,
"acc_stderr": 0.004490130691020433,
"acc_norm": 0.8881696873132842,
"acc_norm_stderr": 0.0031451347677023105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5850673194614443,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.7269545808306953,
"mc2_stderr": 0.01465862803375696
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479651
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.01271440100992365
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/838a27e4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1341
dataset_size: 180
---
# Dataset Card for "838a27e4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sethapun/imdb_misspelled_5 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33631096
num_examples: 25000
- name: validation
num_bytes: 32850598
num_examples: 25000
download_size: 56488953
dataset_size: 66481694
---
# Dataset Card for "imdb_misspelled_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sankethgadadinni/glaive-function-call | ---
license: apache-2.0
---
|
AngoHF/ANGO-S1 | ---
license: llama2
task_categories:
- question-answering
- text2text-generation
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
pretty_name: ANGO
---
ANGO is A Novel Generation-Oriented Chinese LLM evaluation benchmark.
We introduce the single-question, multiple-keypoints dataset format for the first time, covering 171 keypoints organized into 4 hierarchical levels and 9 difficulty categories.
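As an illustration of the single-question, multiple-keypoints structure (the record fields and keypoint names below are hypothetical, not the benchmark's actual schema), each question can carry several keypoint tags, and accuracy can then be aggregated per keypoint:

```python
from collections import defaultdict

# Hypothetical records in a single-question, multiple-keypoints format:
# one question, several keypoint tags, and whether the model answered correctly.
records = [
    {"question": "Q1", "keypoints": ["logic", "reading"], "correct": True},
    {"question": "Q2", "keypoints": ["logic"], "correct": False},
    {"question": "Q3", "keypoints": ["math", "logic"], "correct": True},
]

def keypoint_accuracy(records):
    """Aggregate per-keypoint accuracy: each question contributes
    to every keypoint it is tagged with."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        for kp in r["keypoints"]:
            totals[kp] += 1
            hits[kp] += int(r["correct"])
    return {kp: hits[kp] / totals[kp] for kp in totals}

# "logic" appears in three questions, two answered correctly -> 2/3
print(keypoint_accuracy(records))
```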
The data were exclusively obtained from the Administrative Proficiency Test, which serves as a significant component of the Chinese civil service examination.
We will apply a seasonal system for the leaderboard, updating it every two months. The corresponding test dataset will be announced at the beginning of each season, and some questions will be retired at the end of the season.
More details are available at our [space](https://huggingface.co/spaces/AngoHF/ANGO-Leaderboard) |
KyS/OCR-Campai | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: Campai.pdf
num_bytes: 123461953.0
num_examples: 5240
download_size: 123383170
dataset_size: 123461953.0
configs:
- config_name: default
data_files:
- split: Campai.pdf
path: data/Campai.pdf-*
---
|
heliosprime/twitter_dataset_1713010496 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8459
num_examples: 19
download_size: 8793
dataset_size: 8459
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713010496"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BAAI/CCI-Data | ---
task_categories:
- text-generation
language:
- zh
size_categories:
- 10M<n<100M
---
## Data Description
With the rapid development of large language models, the demand for high-quality datasets in both the industry and academia is growing. These datasets not only need to contain a vast amount of information but also require rigorous screening and cleaning to ensure their accuracy and the safety of downstream models and applications. However, the currently popular public datasets in the industry have certain quality and security risks, especially in the Chinese domain where high-quality datasets are particularly lacking. Moreover, constructing a safe Chinese dataset also faces many challenges. Therefore, building a dataset that has undergone strict screening and standardized processing is particularly important for the innovation and development of LLMs.
Our CCI (Chinese Corpora Internet) dataset consists of high-quality, trustworthy sources from internet sites within mainland China. It has undergone rigorous data cleaning and deduplication, with targeted detection and filtering for content quality. The data processing rules include:
- Rule-based filtering: density-based extraction, keyword filtering, spam information filtering, conversion between simplified and traditional Chinese, etc.
- Model-based filtering: low-quality content is filtered out using a trained classification model
- Deduplication: duplicates are removed both within and across datasets
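The deduplication rule can be sketched with a simple exact content-hash pass (an illustrative minimal version; the actual CCI pipeline is not published and likely also applies fuzzy matching such as MinHash to catch near-duplicates):

```python
import hashlib

def dedup_by_hash(documents):
    """Drop exact duplicates by hashing whitespace-normalized content.
    A real pipeline would likely add fuzzy matching (e.g. MinHash/LSH)
    to catch near-duplicates as well."""
    seen = set()
    unique = []
    for doc in documents:
        # Normalize whitespace so trivially reformatted copies collide.
        key = hashlib.md5(" ".join(doc["content"].split()).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

docs = [
    {"id": "a", "content": "机器学习 简介"},
    {"id": "b", "content": "机器学习   简介"},  # whitespace-only variant of "a"
    {"id": "c", "content": "深度学习 进展"},
]
print([d["id"] for d in dedup_by_hash(docs)])  # -> ['a', 'c']
```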
Additionally, because large-scale pre-training data is prone to leaking evaluation data, we rigorously screen for and filter out several mainstream Chinese evaluation datasets during the data processing phase.
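Such screening for evaluation-data leakage is commonly done via n-gram overlap; a minimal sketch under that assumption (the method actually used for CCI is not specified):

```python
def ngrams(text, n=13):
    """Character n-grams, a common unit for decontamination checks on Chinese text."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def is_contaminated(document, eval_texts, n=13, threshold=0.8):
    """Flag a document if a high fraction of any eval text's n-grams appear in it."""
    doc_grams = ngrams(document, n)
    for eval_text in eval_texts:
        eval_grams = ngrams(eval_text, n)
        if eval_grams and len(eval_grams & doc_grams) / len(eval_grams) >= threshold:
            return True
    return False

eval_texts = ["下列选项中最符合题意的是哪一项,请从四个选项中选择"]
clean_doc = "今天的新闻报道了人工智能的发展"
leaky_doc = "真题:下列选项中最符合题意的是哪一项,请从四个选项中选择。答案解析如下"
print(is_contaminated(clean_doc, eval_texts), is_contaminated(leaky_doc, eval_texts))  # -> False True
```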
The released CCI corpus (CCI v1.0.0) is 104 GB in size. The dataset spans January 2001 to November 2023.
## Update
- November 29, 2023, CCI v1.0.0 released!
## Data Format
| Field | Type | Meaning |
| :-: | :-: | :-: |
| id | String | Document ID, globally unique |
| title | String | Document title |
| content | String | Content of the document |
## Sample
```json
{
"id": "a262c26c915762ae107019f2797fda03",
"title": "深圳人工智能企业闪耀东京展会",
"content": "拳头产品叫好又叫座 深圳人工智能企业闪耀东京展会 近日在东京举行的日本人工智能展上,由深圳市贸促委组织的深圳人工智能企业展团集中亮相,引起热烈关注。该展会是日本规模最大的人工智能展会,云鲸智能、思谋科技、魔耳智能、格瑞普电池、云译科技等近20家深圳人工智能代表性企业的最新人工智能产品吸引了众多当地专业观众的目光,成为展会上的一抹亮色。企业现场“揽单”,参展成果丰硕深圳市大象机器人科技有限公司是一家由海外留学人才来深创建的专注于机器人研发生产的专精特新企业,本次在东京,该公司重点展示了myCobot协作机器人和仿真宠物猫metacat等公司拳头产品。“参展期间我们接待客户数达到500位以上,有意愿成为分销伙伴、集成商或终端客户的有效意向客户近70人,成效相当不错。……"
}
```
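Records in this format can be checked against the field table above with standard JSON tooling; a minimal validation sketch (not part of the official release):

```python
import json

# Schema from the "Data Format" table: all three fields are strings.
REQUIRED_FIELDS = {"id": str, "title": str, "content": str}

def validate_record(line):
    """Parse one JSON record and check it matches the documented schema."""
    record = json.loads(line)
    for field, ftype in REQUIRED_FIELDS.items():
        if not isinstance(record.get(field), ftype):
            raise ValueError(f"field {field!r} missing or not a {ftype.__name__}")
    return record

sample = '{"id": "a262c26c915762ae107019f2797fda03", "title": "深圳人工智能企业闪耀东京展会", "content": "..."}'
record = validate_record(sample)
print(record["title"])
```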
## Download
The CCI dataset is simultaneously open-sourced on the [BAAI DataHub](https://data.baai.ac.cn/data) and Huggingface.
### BAAI DataHub
Users can click the link [CCI Dataset](https://data.baai.ac.cn/details/BAAI-CCI) to view the data files, and click to download.
Note that users need to register on BAAI DataHub to use the data, and filling out a survey questionnaire is required before their first download.
### Huggingface
To use the data, you can load it using the following code:
```python
from datasets import load_dataset
# If the dataset is gated/private, make sure you have run huggingface-cli login
dataset = load_dataset("BAAI/CCI-Data")
```
## User Agreement
Users need to comply with the usage agreement of the CCI dataset. You can view the agreement by clicking on the following link: ([View Usage Agreement](https://data.baai.ac.cn/resources/agreement/cci_usage_aggrement.pdf)).
## Notice
If you have any questions related to this dataset, please contact data@baai.ac.cn. |
thanhduycao/test_4 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: w2v2_transcription
dtype: string
- name: WER
dtype: int64
splits:
- name: train
num_bytes: 1393916.0
num_examples: 18
download_size: 1395008
dataset_size: 1393916.0
---
# Dataset Card for "test_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dong237/empathetic_dialogues_instruction | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: dialogue
dtype: string
splits:
- name: train
num_bytes: 6392746
num_examples: 17780
- name: validation
num_bytes: 1076044
num_examples: 2758
- name: test
num_bytes: 1037401
num_examples: 2540
download_size: 4612892
dataset_size: 8506191
---
# Dataset Card for "empathetic_dialogues_instruction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/congress | ---
language:
- en
tags:
- congress
- tabular_classification
- binary_classification
- UCI
pretty_name: Congress
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- voting
license: cc
---
# Congress
The [Congress dataset](https://archive.ics.uci.edu/ml/datasets/Congress) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Congressmen of two different parties vote on a series of bills. The task is to predict each congressman's party from their voting record.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|---------------------------------------------------------------|
| voting | Binary classification | What's the party of the voter? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/congress", "voting")["train"]
``` |
freshpearYoon/train_free_55 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604535856
num_examples: 10000
download_size: 1372571483
dataset_size: 9604535856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dhmeltzer/flame_tensors | ---
dataset_info:
features:
- name: label
dtype: int64
- name: pixel_values
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 19008408672
num_examples: 31428
- name: validation
num_bytes: 4752706992
num_examples: 7858
- name: test
num_bytes: 5186365800
num_examples: 8575
download_size: 7096810022
dataset_size: 28947481464
---
# Dataset Card for "flame_tensors"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16 | ---
pretty_name: Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T05:46:44.212362](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16/blob/main/results_2023-10-22T05-46-44.212362.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2419253355704698,\n\
\ \"em_stderr\": 0.004385673721154169,\n \"f1\": 0.30457843959731623,\n\
\ \"f1_stderr\": 0.00439090225052454,\n \"acc\": 0.38382232120791804,\n\
\ \"acc_stderr\": 0.007195680070781476\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2419253355704698,\n \"em_stderr\": 0.004385673721154169,\n\
\ \"f1\": 0.30457843959731623,\n \"f1_stderr\": 0.00439090225052454\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.0023892815120772092\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485742\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|arc:challenge|25_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T05_46_44.212362
path:
- '**/details_harness|drop|3_2023-10-22T05-46-44.212362.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T05-46-44.212362.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T05_46_44.212362
path:
- '**/details_harness|gsm8k|5_2023-10-22T05-46-44.212362.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T05-46-44.212362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hellaswag|10_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-01T13:56:27.012351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-01T13:56:27.012351.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T05_46_44.212362
path:
- '**/details_harness|winogrande|5_2023-10-22T05-46-44.212362.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T05-46-44.212362.parquet'
- config_name: results
data_files:
- split: 2023_08_01T13_56_27.012351
path:
- results_2023-08-01T13:56:27.012351.parquet
- split: 2023_10_22T05_46_44.212362
path:
- results_2023-10-22T05-46-44.212362.parquet
- split: latest
path:
- results_2023-10-22T05-46-44.212362.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16",
"harness_winogrande_5",
split="train")
```
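The timestamped split names listed in the configuration above are derived from the run timestamp. A minimal sketch of the convention (an assumption based on the split names in this card, not an official API):

```python
def split_name_from_timestamp(run_timestamp: str) -> str:
    """Convert a run timestamp such as '2023-10-22T05:46:44.212362'
    into the split name used in this card ('2023_10_22T05_46_44.212362')."""
    # The date dashes and the time colons both become underscores.
    return run_timestamp.replace("-", "_").replace(":", "_")

latest_split = split_name_from_timestamp("2023-10-22T05:46:44.212362")
# latest_split == "2023_10_22T05_46_44.212362"
```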
## Latest results
These are the [latest results from run 2023-10-22T05:46:44.212362](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16/blob/main/results_2023-10-22T05-46-44.212362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.2419253355704698,
"em_stderr": 0.004385673721154169,
"f1": 0.30457843959731623,
"f1_stderr": 0.00439090225052454,
"acc": 0.38382232120791804,
"acc_stderr": 0.007195680070781476
},
"harness|drop|3": {
"em": 0.2419253355704698,
"em_stderr": 0.004385673721154169,
"f1": 0.30457843959731623,
"f1_stderr": 0.00439090225052454
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772092
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.012002078629485742
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mattyhatch/tomatoesTest2 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 174243.0
num_examples: 1
download_size: 23284
dataset_size: 174243.0
---
# Dataset Card for "tomatoesTest2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zjunlp/InstructIE | ---
license: mit
task_categories:
- text2text-generation
language:
- en
- zh
tags:
- information-extraction
- entity
- relation
pretty_name: InstructIE
size_categories:
- 100M<n<1B
---
# InstructIE: A Bilingual Instruction-based Information Extraction Dataset
[Paper](https://doi.org/10.48550/arXiv.2305.11527)
## News
* [2024/02] We released a large-scale (0.32B tokens) high-quality bilingual (Chinese and English) Information Extraction (IE) instruction tuning dataset named [IEPile](https://huggingface.co/datasets/zjunlp/iepie), along with two models trained on `IEPile`, [baichuan2-13b-iepile-lora](https://huggingface.co/zjunlp/baichuan2-13b-iepile-lora) and [llama2-13b-iepile-lora](https://huggingface.co/zjunlp/llama2-13b-iepile-lora).
* [2023/10] We released a new bilingual (Chinese and English) theme-based Information Extraction (IE) instruction dataset named [InstructIE](https://huggingface.co/datasets/zjunlp/InstructIE).
* [2023/08] We introduced a dedicated 13B model for Information Extraction (IE), named [knowlm-13b-ie](https://huggingface.co/zjunlp/knowlm-13b-ie/tree/main).
* [2023/05] We initiated an instruction-based Information Extraction project.
InstructIE is a bilingual information extraction dataset based on topic schemas. We divide the texts into 12 topics, namely Person, Geographic_Location, Building, Works, Creature, Artificial_Object, Natural_Science, Organization, Transport, Event, Astronomy, and Medicine. For each topic, we have designed corresponding schemas. We expect the model to learn a general extraction capability on InstructIE and generalize it to other domains.
```
InstrueIE
├── train_zh_old.json # Chinese training set, the dataset used in the paper "InstructIE: A Bilingual Instruction-based Information Extraction Dataset".
├── train_en_old.json # English training set, the dataset used in the paper "InstructIE: A Bilingual Instruction-based Information Extraction Dataset".
├── train_zh.json # Chinese training set enhanced with LLMs.
├── train_en.json # English training set enhanced with LLMs.
├── dev_zh.json # Chinese validation set.
├── dev_en.json # English validation set.
├── test_zh.json # Chinese test set.
├── test_en.json # English test set.
├── schema_zh.json # Schema information for 12 topics in Chinese.
├── schema_en.json # Schema information for 12 topics in English.
├── InstrueIE-zh
│ ├── InstrueIE_人物
│ │ ├── train.json # Subsample of 5000 samples, full samples can be obtained from train_zh.json
│ │ ├── dev.json
│ │ ├── schema.json
│ │ └── test.json
│ ├── InstrueIE_建筑结构
│ ├── InstrueIE_组织
│ ├── InstrueIE_生物
│ ├── ...
├── InstrueIE-en
│ ├── InstrueIE_Person
│ ├── InstrueIE_Creature
```
<b>Example of data</b>
```
{
"id": "841ef2af4cfe766dd9295fb7daf321c299df0fd0cef14820dfcb421161eed4a1",
"text": "NGC1313 is a galaxy in the constellation of Reticulum. It was discovered by the Australian astronomer James Dunlop on September 27, 1826. It has a prominent uneven shape, and its axis does not completely revolve around its center. Near NGC1313, there is another galaxy, NGC1309.",
"relation": [
{"head": "NGC1313", "head_type": "astronomical object type", "relation": "time of discovery", "tail": "September 27, 1826", "tail_type": "time"},
{"head": "NGC1313", "head_type": "astronomical object type", "relation": "discoverer or inventor", "tail": "James Dunlop", "tail_type": "organization/human"},
{"head": "NGC1313", "head_type": "astronomical object type", "relation": "of", "tail": "Reticulum", "tail_type": "astronomical object type"}
]
}
```
| Field | Description |
| ----------- | ---------------------------------------------------------------- |
| id | The unique identifier for each data point. |
| cate | The category of the text's subject, with a total of 12 different thematic categories. |
| text | The input text for the model, with the goal of extracting all the involved relationship triples. |
| relation | Describes the relationship triples contained in the text, i.e., (head, head_type, relation, tail, tail_type). |
With the fields mentioned above, users can flexibly design and implement instructions and output formats for different information extraction needs.
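As a rough illustration of that flexibility (the prompt template below is invented for this example and is not part of the dataset), one might turn a record into an instruction/output pair like this:

```python
def to_instruction_pair(record: dict) -> tuple[str, str]:
    """Build a simple (instruction, output) pair from an InstructIE record.

    The prompt wording here is a hypothetical example; design your own
    template to suit the extraction task.
    """
    relation_types = sorted({r["relation"] for r in record["relation"]})
    instruction = (
        "Extract all relation triples of the following types from the text: "
        + ", ".join(relation_types)
        + "\nText: "
        + record["text"]
    )
    # Render each triple as (head, relation, tail), one per line.
    output = "\n".join(
        f'({r["head"]}, {r["relation"]}, {r["tail"]})' for r in record["relation"]
    )
    return instruction, output

# Abbreviated record from the example above.
record = {
    "text": "NGC1313 is a galaxy in the constellation of Reticulum. It was "
            "discovered by the Australian astronomer James Dunlop on "
            "September 27, 1826.",
    "relation": [
        {"head": "NGC1313", "head_type": "astronomical object type",
         "relation": "time of discovery", "tail": "September 27, 1826",
         "tail_type": "time"},
        {"head": "NGC1313", "head_type": "astronomical object type",
         "relation": "discoverer or inventor", "tail": "James Dunlop",
         "tail_type": "organization/human"},
    ],
}
instruction, output = to_instruction_pair(record)
```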
[Tutorial](https://github.com/zjunlp/DeepKE/blob/main/example/llm/InstructKGC/README.md)
## Citation
Please cite the following paper if you use InstructIE in your work.
```bibtex
@article{DBLP:journals/corr/abs-2305-11527,
author = {Honghao Gui and
Shuofei Qiao and
Jintian Zhang and
Hongbin Ye and
Mengshu Sun and
Lei Liang and
Huajun Chen and
Ningyu Zhang},
title = {InstructIE: {A} Bilingual Instruction-based Information Extraction
Dataset},
journal = {CoRR},
volume = {abs/2305.11527},
year = {2023},
url = {https://doi.org/10.48550/arXiv.2305.11527},
doi = {10.48550/ARXIV.2305.11527},
eprinttype = {arXiv},
eprint = {2305.11527},
timestamp = {Thu, 22 Feb 2024 09:46:17 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2305-11527.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
sam-mosaic/dolly_chatml | ---
language: en
dataset_info:
features:
- name: text
dtype: string
- name: cat
dtype: string
splits:
- name: train
num_bytes: 11767434
num_examples: 8497
download_size: 5401759
dataset_size: 11767434
---
# Dataset Card for "dolly_chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_D_PNP_GENERIC_Q_rices_ns_200 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 28448
num_examples: 200
download_size: 13914
dataset_size: 28448
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_D_PNP_GENERIC_Q_rices_ns_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Chris5Lin/cc_raw | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 37
num_examples: 4
download_size: 524
dataset_size: 37
---
# Dataset Card for "cc_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrSoul7766/instagram_post_captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2823560631.721
num_examples: 30191
download_size: 3568704441
dataset_size: 2823560631.721
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ai-nightcoder/Uzbek_LLM_llama | ---
license: llama2
---
|
manu/trivia_qa_wiki | ---
dataset_info:
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: validation
num_bytes: 430166050
num_examples: 7993
download_size: 234775285
dataset_size: 430166050
---
# Dataset Card for "trivia_qa_wiki_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JoseLuis95/adv-ele | ---
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- text2text-generation
- translation
pretty_name: FromAdvancedtoBasicEnglish
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 292694
dataset_size: 538897.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
lionelchg/dolly_information_extraction | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4662489.3625498
num_examples: 1430
- name: test
num_bytes: 247796.6374501992
num_examples: 76
download_size: 2857133
dataset_size: 4910286.0
---
# Dataset Card for "dolly_information_extraction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_computer_science | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 8550
num_examples: 5
- name: test
num_bytes: 421871
num_examples: 100
download_size: 103795
dataset_size: 430421
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-college_computer_science"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Adapting/empathetic_dialogues_v2 | ---
license: afl-3.0
---
A fine-tuned version of the empathetic dialogues dataset from https://huggingface.co/datasets/empathetic_dialogues,
with labels for chat history, system response, whether the utterance is a question, and behavior.
|
NomiDecent/Open_Dataset | ---
license: llama2
---
|
open-llm-leaderboard/details_occultml__Helios-10.7B-v2 | ---
pretty_name: Evaluation run of occultml/Helios-10.7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [occultml/Helios-10.7B-v2](https://huggingface.co/occultml/Helios-10.7B-v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_occultml__Helios-10.7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T12:23:10.136079](https://huggingface.co/datasets/open-llm-leaderboard/details_occultml__Helios-10.7B-v2/blob/main/results_2024-01-04T12-23-10.136079.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4114274647380307,\n\
\ \"acc_stderr\": 0.034055832181383035,\n \"acc_norm\": 0.41611952976407623,\n\
\ \"acc_norm_stderr\": 0.03500219697159756,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.5550965546640495,\n\
\ \"mc2_stderr\": 0.016601840091756987\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35494880546075086,\n \"acc_stderr\": 0.013983036904094095,\n\
\ \"acc_norm\": 0.3916382252559727,\n \"acc_norm_stderr\": 0.014264122124938213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.34266082453694485,\n\
\ \"acc_stderr\": 0.004736292355716404,\n \"acc_norm\": 0.46634136626170086,\n\
\ \"acc_norm_stderr\": 0.0049784626909669255\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.031158522131357797,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.031158522131357797\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.038783523721386215,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.038783523721386215\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918428,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918428\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4870967741935484,\n \"acc_stderr\": 0.02843453315268186,\n \"\
acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.02843453315268186\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271847,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271847\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5077720207253886,\n \"acc_stderr\": 0.036080032255696545,\n\
\ \"acc_norm\": 0.5077720207253886,\n \"acc_norm_stderr\": 0.036080032255696545\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3769230769230769,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.3769230769230769,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4917431192660551,\n \"acc_stderr\": 0.021434399918214338,\n \"\
acc_norm\": 0.4917431192660551,\n \"acc_norm_stderr\": 0.021434399918214338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24537037037037038,\n \"acc_stderr\": 0.029346665094372937,\n \"\
acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.029346665094372937\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39215686274509803,\n \"acc_stderr\": 0.03426712349247271,\n \"\
acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.03426712349247271\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.48523206751054854,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.48523206751054854,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.452914798206278,\n\
\ \"acc_stderr\": 0.033408675019233246,\n \"acc_norm\": 0.452914798206278,\n\
\ \"acc_norm_stderr\": 0.033408675019233246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.04950504382128919,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.04950504382128919\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5299145299145299,\n\
\ \"acc_stderr\": 0.03269741106812444,\n \"acc_norm\": 0.5299145299145299,\n\
\ \"acc_norm_stderr\": 0.03269741106812444\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5593869731800766,\n\
\ \"acc_stderr\": 0.017753396973908493,\n \"acc_norm\": 0.5593869731800766,\n\
\ \"acc_norm_stderr\": 0.017753396973908493\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n\
\ \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.19664804469273742,\n\
\ \"acc_stderr\": 0.013293183027454641,\n \"acc_norm\": 0.19664804469273742,\n\
\ \"acc_norm_stderr\": 0.013293183027454641\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.028099240775809553,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.028099240775809553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02728160834446942,\n \
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02728160834446942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2900912646675359,\n\
\ \"acc_stderr\": 0.011590375554733093,\n \"acc_norm\": 0.2900912646675359,\n\
\ \"acc_norm_stderr\": 0.011590375554733093\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.02714627193662517,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.02714627193662517\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.02017061497496978,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.02017061497496978\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3836734693877551,\n \"acc_stderr\": 0.03113088039623593,\n\
\ \"acc_norm\": 0.3836734693877551,\n \"acc_norm_stderr\": 0.03113088039623593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5522388059701493,\n\
\ \"acc_stderr\": 0.03516184772952167,\n \"acc_norm\": 0.5522388059701493,\n\
\ \"acc_norm_stderr\": 0.03516184772952167\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.037439798259264,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.037439798259264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.5550965546640495,\n\
\ \"mc2_stderr\": 0.016601840091756987\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7063930544593529,\n \"acc_stderr\": 0.012799397296204182\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/occultml/Helios-10.7B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|arc:challenge|25_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|gsm8k|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hellaswag|10_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-23-10.136079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T12-23-10.136079.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- '**/details_harness|winogrande|5_2024-01-04T12-23-10.136079.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T12-23-10.136079.parquet'
- config_name: results
data_files:
- split: 2024_01_04T12_23_10.136079
path:
- results_2024-01-04T12-23-10.136079.parquet
- split: latest
path:
- results_2024-01-04T12-23-10.136079.parquet
---
# Dataset Card for Evaluation run of occultml/Helios-10.7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [occultml/Helios-10.7B-v2](https://huggingface.co/occultml/Helios-10.7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_occultml__Helios-10.7B-v2",
"harness_winogrande_5",
split="train")
```
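The configuration names used above follow directly from the harness task names shown in the YAML (`harness|hendrycksTest-virology|5` becomes `harness_hendrycksTest_virology_5`, `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A minimal sketch of that mapping, assuming the pattern holds for all tasks in this repo:

```python
def config_name(task: str) -> str:
    # Map a harness task id to its dataset configuration name by
    # replacing the separator characters with underscores, e.g.
    # "harness|hendrycksTest-virology|5" -> "harness_hendrycksTest_virology_5"
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```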
## Latest results
These are the [latest results from run 2024-01-04T12:23:10.136079](https://huggingface.co/datasets/open-llm-leaderboard/details_occultml__Helios-10.7B-v2/blob/main/results_2024-01-04T12-23-10.136079.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4114274647380307,
"acc_stderr": 0.034055832181383035,
"acc_norm": 0.41611952976407623,
"acc_norm_stderr": 0.03500219697159756,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.5550965546640495,
"mc2_stderr": 0.016601840091756987
},
"harness|arc:challenge|25": {
"acc": 0.35494880546075086,
"acc_stderr": 0.013983036904094095,
"acc_norm": 0.3916382252559727,
"acc_norm_stderr": 0.014264122124938213
},
"harness|hellaswag|10": {
"acc": 0.34266082453694485,
"acc_stderr": 0.004736292355716404,
"acc_norm": 0.46634136626170086,
"acc_norm_stderr": 0.0049784626909669255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.031158522131357797,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.031158522131357797
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.038783523721386215,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.038783523721386215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918428,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918428
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.02843453315268186,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.02843453315268186
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271847,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271847
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5077720207253886,
"acc_stderr": 0.036080032255696545,
"acc_norm": 0.5077720207253886,
"acc_norm_stderr": 0.036080032255696545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3769230769230769,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.3769230769230769,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4917431192660551,
"acc_stderr": 0.021434399918214338,
"acc_norm": 0.4917431192660551,
"acc_norm_stderr": 0.021434399918214338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.029346665094372937,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.029346665094372937
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.03426712349247271,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.03426712349247271
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.48523206751054854,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.48523206751054854,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.452914798206278,
"acc_stderr": 0.033408675019233246,
"acc_norm": 0.452914798206278,
"acc_norm_stderr": 0.033408675019233246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.04950504382128919,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.04950504382128919
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5299145299145299,
"acc_stderr": 0.03269741106812444,
"acc_norm": 0.5299145299145299,
"acc_norm_stderr": 0.03269741106812444
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5593869731800766,
"acc_stderr": 0.017753396973908493,
"acc_norm": 0.5593869731800766,
"acc_norm_stderr": 0.017753396973908493
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.19664804469273742,
"acc_stderr": 0.013293183027454641,
"acc_norm": 0.19664804469273742,
"acc_norm_stderr": 0.013293183027454641
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809553,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02728160834446942,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02728160834446942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2900912646675359,
"acc_stderr": 0.011590375554733093,
"acc_norm": 0.2900912646675359,
"acc_norm_stderr": 0.011590375554733093
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.02714627193662517,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.02714627193662517
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.02017061497496978,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.02017061497496978
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3836734693877551,
"acc_stderr": 0.03113088039623593,
"acc_norm": 0.3836734693877551,
"acc_norm_stderr": 0.03113088039623593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5522388059701493,
"acc_stderr": 0.03516184772952167,
"acc_norm": 0.5522388059701493,
"acc_norm_stderr": 0.03516184772952167
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.037439798259264,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.037439798259264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.5550965546640495,
"mc2_stderr": 0.016601840091756987
},
"harness|winogrande|5": {
"acc": 0.7063930544593529,
"acc_stderr": 0.012799397296204182
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
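Once the JSON file linked above is downloaded, the per-task entries can be post-processed offline. As a minimal sketch (using a hand-copied excerpt of the dict above rather than the full file), averaging `acc` over the MMLU (`hendrycksTest-*`) subtasks:

```python
# Excerpt mirroring the structure of the results JSON above (two MMLU
# subtasks plus one non-MMLU task to show the filtering).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741},
    "harness|winogrande|5": {"acc": 0.7063930544593529},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(mmlu) / len(mmlu)
print(round(mmlu_acc, 4))  # → 0.377
```

The same filter applied to the full results file reproduces the kind of per-benchmark aggregate shown in the `"all"` block.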
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ACT8113/PomuRainpuffSmallVersion | ---
license: openrail
---
|
OpenDriveLab/OpenScene | ---
license: cc-by-nc-sa-4.0
--- |
Lipe3434/vozdvd | ---
license: openrail
---
|
davanstrien/test_imdb_embedd3 | ---
dataset_info:
features:
- name: act
dtype: string
- name: prompt
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 310201
num_examples: 153
download_size: 398364
dataset_size: 310201
---
# Dataset Card for "test_imdb_embedd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nadav/runaway_scans | ---
license: afl-3.0
---
|
arieg/bw_spec_cls_80 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '1039'
'2': '1040'
'3': '1082'
'4': '1083'
'5': '1102'
'6': '1193'
'7': '1195'
'8': '1196'
'9': '1197'
'10': '1270'
'11': '1276'
'12': '1277'
'13': '1278'
'14': '140'
'15': '141'
'16': '1417'
'17': '1427'
'18': '1443'
'19': '1482'
'20': '1510'
'21': '1544'
'22': '1642'
'23': '1644'
'24': '1649'
'25': '1661'
'26': '1663'
'27': '1666'
'28': '1673'
'29': '1680'
'30': '1681'
'31': '1682'
'32': '1683'
'33': '1684'
'34': '1685'
'35': '190'
'36': '193'
'37': '194'
'38': '197'
'39': '2'
'40': '200'
'41': '203'
'42': '204'
'43': '207'
'44': '210'
'45': '211'
'46': '212'
'47': '213'
'48': '255'
'49': '256'
'50': '368'
'51': '424'
'52': '5'
'53': '534'
'54': '540'
'55': '546'
'56': '574'
'57': '615'
'58': '620'
'59': '621'
'60': '625'
'61': '666'
'62': '667'
'63': '676'
'64': '694'
'65': '695'
'66': '714'
'67': '715'
'68': '716'
'69': '718'
'70': '777'
'71': '814'
'72': '821'
'73': '822'
'74': '825'
'75': '853'
'76': '897'
'77': '995'
'78': '997'
'79': '998'
splits:
- name: train
num_bytes: 89716674.4
num_examples: 1600
download_size: 87975685
dataset_size: 89716674.4
---
# Dataset Card for "bw_spec_cls_80"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaliyev/Spotify-audio-features | ---
license: unknown
---
|
bond005/taiga_speech | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: normalized
dtype: string
- name: speaker
dtype: string
splits:
- name: train
num_bytes: 124017929971.28
num_examples: 173442
download_size: 119944587686
dataset_size: 124017929971.28
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lamm-mit/leaf-images | ---
license: apache-2.0
---
### LeafGAN: Nature-inspired architected materials using unsupervised deep learning
Reference: Shen, S.C., Buehler, M.J. Nature-inspired Architected materials using unsupervised deep learning. Communications Enginering, 2022, DOI: https://www.nature.com/articles/s44172-022-00037-0
Abstract: Nature-inspired material design is driven by superior properties found in natural architected materials and enabled by recent developments in additive manufacturing and machine learning. Existing approaches to push design beyond biomimicry typically use supervised deep learning algorithms to predict and optimize properties based on experimental or simulation data. However, these methods constrain generated material designs to abstracted labels and to “black box” outputs that are only indirectly manipulable. Here we report an alternative approach using an unsupervised generative adversarial network (GAN) model. Training the model on unlabeled data constructs a latent space free of human intervention, which can then be explored through seeding, image encoding, and vector arithmetic to control specific parameters of de novo generated material designs and to push them beyond training data distributions for broad applicability. We illustrate this end-to-end with new materials inspired by leaf microstructures, showing how biological 2D structures can be used to develop novel architected materials in 2 and 3 dimensions. We further utilize a genetic algorithm to optimize generated microstructures for mechanical properties, operating directly on the latent space. This approach allows for transfer of information across manifestations using the latent space as mediator, opening new avenues for exploration of nature-inspired materials.

Code: https://github.com/lamm-mit/LeafGAN/ |
LeonardoTiger/catalyst | ---
license: openrail
---
|
dev-senolys/themas_dataset | ---
dataset_info:
features:
- name: identifier
dtype: string
- name: categories
dtype: string
- name: article
dtype: string
- name: themas
dtype:
class_label:
names:
'0': agronomy
'1': business
'2': design
'3': digital
'4': environment
'5': learning
'6': medical
'7': people
'8': production
'9': resource
'10': science
'11': security
'12': society
'13': transport
- name: themas_ids
sequence: int64
splits:
- name: train_dataset_themas
num_bytes: 3496795
num_examples: 928
- name: val_dataset_themas
num_bytes: 775166
num_examples: 198
- name: test_dataset_themas
num_bytes: 786072
num_examples: 200
download_size: 2886738
dataset_size: 5058033
---
# Dataset Card for "themas_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
martagrueso/adv-ele | ---
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 295540
dataset_size: 538897.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_harborwater__open-llama-3b-claude-30k | ---
pretty_name: Evaluation run of harborwater/open-llama-3b-claude-30k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [harborwater/open-llama-3b-claude-30k](https://huggingface.co/harborwater/open-llama-3b-claude-30k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__open-llama-3b-claude-30k\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T22:19:50.317589](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-claude-30k/blob/main/results_2023-12-02T22-19-50.317589.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.021986353297952996,\n\
\ \"acc_stderr\": 0.004039162758110046\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.021986353297952996,\n \"acc_stderr\": 0.004039162758110046\n\
\ }\n}\n```"
repo_url: https://huggingface.co/harborwater/open-llama-3b-claude-30k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T22_19_50.317589
path:
- '**/details_harness|gsm8k|5_2023-12-02T22-19-50.317589.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T22-19-50.317589.parquet'
- config_name: results
data_files:
- split: 2023_12_02T22_19_50.317589
path:
- results_2023-12-02T22-19-50.317589.parquet
- split: latest
path:
- results_2023-12-02T22-19-50.317589.parquet
---
# Dataset Card for Evaluation run of harborwater/open-llama-3b-claude-30k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/harborwater/open-llama-3b-claude-30k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [harborwater/open-llama-3b-claude-30k](https://huggingface.co/harborwater/open-llama-3b-claude-30k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__open-llama-3b-claude-30k",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T22:19:50.317589](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-claude-30k/blob/main/results_2023-12-02T22-19-50.317589.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.021986353297952996,
"acc_stderr": 0.004039162758110046
},
"harness|gsm8k|5": {
"acc": 0.021986353297952996,
"acc_stderr": 0.004039162758110046
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ozanba/final_project_dataset | ---
license: other
---
|
imvladikon/parashoot | ---
task_categories:
- question-answering
language:
- he
---
# ParaShoot
[ParaShoot](https://github.com/omrikeren/ParaShoot): A Hebrew question answering dataset in the style of [SQuAD](https://arxiv.org/abs/1606.05250), based on articles scraped from Wikipedia. The dataset contains a few thousand crowdsource-annotated pairs of questions and answers, in a setting suitable for few-shot learning.
For more details and quality analysis, see the [paper](https://arxiv.org/abs/2109.11314).
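Since ParaShoot follows the SQuAD format, each record pairs a context paragraph with a question and character-indexed answer spans. A minimal sketch of that layout (the values below are illustrative English placeholders, not actual ParaShoot data):

```python
# SQuAD-style record layout: answers are given as text spans plus their
# character offsets into the context paragraph.
record = {
    "context": "Wikipedia is a free online encyclopedia.",
    "question": "What is Wikipedia?",
    "answers": {"text": ["a free online encyclopedia"], "answer_start": [13]},
}

# The answer span can be recovered from the context via the character offset:
start = record["answers"]["answer_start"][0]
text = record["answers"]["text"][0]
assert record["context"][start:start + len(text)] == text
```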
## Dataset Statistics
| Split | **#Items** | **#Articles** | **#Paragraphs** |
| ---------- | ------------- | --------------- | ------- |
| Train | 1792 | 295 | 565 |
| Dev | 221 | 33 | 63 |
| Test | 1025 | 165 | 319 |
| **Total** | **3038** | **493** | **947** |
## Citing
If you use ParaShoot in your research, please cite the ParaShoot paper:
```bibtex
@inproceedings{keren2021parashoot,
title={ParaShoot: A Hebrew Question Answering Dataset},
author={Keren, Omri and Levy, Omer},
booktitle={Proceedings of the 3rd Workshop on Machine Reading for Question Answering},
pages={106--112},
year={2021}
}
``` |
language-and-voice-lab/samromur_asr | ---
annotations_creators:
- crowdsourced
language:
- is
language_creators:
- crowdsourced
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: "Samrómur Icelandic Speech 1.0."
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- crowd-sourced icelandic
- "samrómur"
- icelandic speech
- samromur
- iceland
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for samromur_asr
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Samrómur 21.05]
- **Repository:** [OpenSLR](http://www.openslr.org/112/)
- **Paper:** [Samrómur: Crowd-sourcing Data Collection for Icelandic Speech Recognition](https://aclanthology.org/2020.lrec-1.425.pdf)
- **Point of Contact:** [Jón Guðnason](mailto:jg@ru.is)
### Dataset Summary
This is the first release of the Samrómur Icelandic Speech corpus, containing 100,000 validated utterances.
The corpus is a result of the crowd-sourcing effort run by the Language and Voice Lab at the Reykjavik University, in cooperation with Almannarómur, Center for Language Technology.
### Example Usage
The Samrómur Corpus is divided into 3 splits: train, validation, and test. To load the full dataset:
```python
from datasets import load_dataset
samromur_asr = load_dataset("language-and-voice-lab/samromur_asr")
```
To load a specific split (for example, the validation split), do:
```python
from datasets import load_dataset
samromur_asr = load_dataset("language-and-voice-lab/samromur_asr",split="validation")
```
### Supported Tasks
automatic-speech-recognition: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
### Languages
The audio is in Icelandic.
The reading prompts were gathered from a variety of sources, mainly from the [Icelandic Gigaword Corpus](http://clarin.is/en/resources/gigaword). The corpus includes text from novels, news, plays, and from a list of location names in Iceland. The prompts also came from the [Icelandic Web of Science](https://www.visindavefur.is/).
## Dataset Structure
### Data Instances
```python
{
'audio_id': '009123-0150695',
'audio': {
'path': '/home/david/.cache/HuggingFace/datasets/downloads/extracted/cb428a7f1e46b058d76641ef32f36b49d28b73aea38509983170495408035a10/dev/009123/009123-0150695.flac',
'array': array([0., 0., 0., ..., 0., 0., 0.], dtype=float32),
'sampling_rate': 16000
},
'speaker_id': '009123',
'gender': 'female',
'age': '18-19',
'duration': 3.299999952316284,
'normalized_text': 'það skipti heldur engu'
}
```
### Data Fields
* `audio_id` (string) - id of audio segment
* `audio` (datasets.Audio) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate. In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio inside its archive (as files are not downloaded and extracted locally).
* `speaker_id` (string) - id of speaker
* `gender` (string) - gender of speaker (male or female)
* `age` (string) - range of age of the speaker.
* `duration` (float32) - duration of the audio file in seconds.
* `normalized_text` (string) - normalized audio segment transcription.
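The relationship between the `audio` field and `duration` can be illustrated with a synthetic instance (a sketch only; in practice the dict comes from indexing the loaded dataset, e.g. `samromur_asr["train"][0]`):

```python
import numpy as np

# Synthetic stand-in for one dataset instance (silence, for illustration).
sampling_rate = 16000
instance = {
    "audio": {
        "array": np.zeros(int(3.3 * sampling_rate), dtype=np.float32),
        "sampling_rate": sampling_rate,
    },
    "duration": 3.3,
}

# Duration in seconds is the number of samples divided by the sampling rate.
computed = len(instance["audio"]["array"]) / instance["audio"]["sampling_rate"]
print(round(computed, 1))  # 3.3
```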
### Data Splits
The corpus is split into train, validation, and test subsets with no speaker overlap. Each subset contains folders that correspond to speaker IDs, and the audio files inside use the following naming convention: {speaker_ID}-{utterance_ID}.flac. Lengths of each portion are: train=114h34m, test=15h51m, validation=15h16m.
To load a specific portion, please see the section "Example Usage" above.
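Given that naming convention, the speaker can be recovered from an `audio_id` directly (using the instance shown earlier as an example):

```python
# audio_id follows {speaker_ID}-{utterance_ID}; splitting on the first
# hyphen recovers both parts.
audio_id = "009123-0150695"
speaker_id, utterance_id = audio_id.split("-", 1)
print(speaker_id, utterance_id)  # 009123 0150695
```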
## Dataset Creation
### Curation Rationale
* The recording has started in October 2019 and continues to this day (May 2021).
* This release has been authorized for release in May 2021.
* The aim is to create an open-source speech corpus to enable research and development for Icelandic Language Technology.
* The corpus contains audio recordings and a metadata file that contains the prompts the participants read.
* A Kaldi-based script using this data can be found on the Language and Voice Lab GitHub page: https://github.com/cadia-lvl/samromur-asr
### Source Data
#### Initial Data Collection and Normalization
* The utterances were recorded by a smartphone or the web app.
* The data was collected using the website https://samromur.is, code of which is available at https://github.com/cadia-lvl/samromur.
* Each recording contains one read sentence from a script.
* The script contains 85,080 unique sentences and 90,838 unique tokens.
### Annotations
#### Annotation process
Prompts were pulled from these corpora if they met the criteria of having only letters which are present in the Icelandic alphabet, and if they are listed in the [DIM: Database Icelandic Morphology](https://aclanthology.org/W19-6116.pdf).
There are also synthesised prompts consisting of a name followed by a question or a demand, in order to simulate a dialogue with a smart-device.
#### Who are the annotators?
The content of the audio files was manually verified against the prompts by one or more listeners (mainly summer students).
### Personal and Sensitive Information
The dataset consists of people who have donated their voice. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This contribution describes an ongoing project of speech data collection, using the web application Samrómur which is built upon Common Voice, Mozilla Foundation's web platform for open-source voice collection. The goal of the project is to build a large-scale speech corpus for Automatic Speech Recognition (ASR) for Icelandic. Upon completion, Samrómur will be the largest open speech corpus for Icelandic collected from the public domain.
### Discussion of Biases
* The participants are aged between 18 and 90; 59,782 recordings are from female speakers and 40,218 from male speakers, recorded with a smartphone or the web app.
* Participants self-reported their age group, gender, and the native language.
* The corpus contains 100,000 utterances from 8,392 speakers, totalling 145 hours.
### Other Known Limitations
"Samromur 21.05" by the Language and Voice Laboratory (LVL) at the Reykjavik University is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License with the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
## Additional Information
### Dataset Curators
The corpus is a result of the crowd-sourcing effort run by the Language and Voice Lab at the Reykjavik University, in cooperation with Almannarómur, Center for Language Technology.
### Licensing Information
[CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@inproceedings{mollberg-etal-2020-samromur,
title = "{S}amr{\'o}mur: Crowd-sourcing Data Collection for {I}celandic Speech Recognition",
author = "Mollberg, David Erik and
J{\'o}nsson, {\'O}lafur Helgi and
{\TH}orsteinsd{\'o}ttir, Sunneva and
Steingr{\'\i}msson, Stein{\th}{\'o}r and
Magn{\'u}sd{\'o}ttir, Eyd{\'\i}s Huld and
Gudnason, Jon",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.425",
pages = "3463--3467",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
### Contributions
This project was funded by the Language Technology Programme for Icelandic 2019-2023. The programme, which is managed and coordinated by Almannarómur, is funded by the Icelandic Ministry of Education, Science and Culture.
The verification for the dataset was funded by the Icelandic Directorate of Labour's Student Summer Job Program.
Special thanks to the summer students for all their hard work.
|
open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo | ---
pretty_name: Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/laser-dolphin-mixtral-4x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T01:56:15.562894](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo/blob/main/results_2024-01-14T01-56-15.562894.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304823287754658,\n\
\ \"acc_stderr\": 0.03239962883986832,\n \"acc_norm\": 0.6345924216801483,\n\
\ \"acc_norm_stderr\": 0.033044077680253386,\n \"mc1\": 0.4589963280293758,\n\
\ \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6377296280073737,\n\
\ \"mc2_stderr\": 0.015266761289957081\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n\
\ \"acc_stderr\": 0.004676898861978916,\n \"acc_norm\": 0.8580959968133838,\n\
\ \"acc_norm_stderr\": 0.003482384956632782\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431353,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431353\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n\
\ \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n\
\ \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n\
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n\
\ \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6377296280073737,\n\
\ \"mc2_stderr\": 0.015266761289957081\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4488248673237301,\n \
\ \"acc_stderr\": 0.01370015744278808\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-15.562894.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- '**/details_harness|winogrande|5_2024-01-14T01-56-15.562894.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T01-56-15.562894.parquet'
- config_name: results
data_files:
- split: 2024_01_14T01_56_15.562894
path:
- results_2024-01-14T01-56-15.562894.parquet
- split: latest
path:
- results_2024-01-14T01-56-15.562894.parquet
---
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo
Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-4x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
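Since each timestamped split name encodes the run time (with `-` and `:` replaced by `_`, as seen in the configs above), a split name can be mapped back to a `datetime`. This is a small illustrative sketch, assuming that naming convention; the helper name is hypothetical:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Convert a split name such as '2024_01_14T01_56_15.562894' to a datetime.

    Assumes the Open LLM Leaderboard convention where '-' in the date and
    ':' in the time are both replaced by '_'.
    """
    date_part, time_part = split_name.split("T")
    date = date_part.replace("_", "-")   # restore ISO date separators
    time = time_part.replace("_", ":")   # restore ISO time separators
    return datetime.fromisoformat(f"{date}T{time}")

print(split_to_datetime("2024_01_14T01_56_15.562894"))
# → 2024-01-14 01:56:15.562894
```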
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-14T01:56:15.562894](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo/blob/main/results_2024-01-14T01-56-15.562894.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6304823287754658,
"acc_stderr": 0.03239962883986832,
"acc_norm": 0.6345924216801483,
"acc_norm_stderr": 0.033044077680253386,
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6377296280073737,
"mc2_stderr": 0.015266761289957081
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.004676898861978916,
"acc_norm": 0.8580959968133838,
"acc_norm_stderr": 0.003482384956632782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431353,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6377296280073737,
"mc2_stderr": 0.015266761289957081
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.4488248673237301,
"acc_stderr": 0.01370015744278808
}
}
```
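The aggregated JSON above can be post-processed with plain Python. As a minimal sketch (using a small hand-copied subset of the scores above rather than a live download), this ranks the MMLU (`hendrycksTest`) subtasks by `acc_norm`:

```python
# Values copied from the aggregated results above (illustrative subset only;
# the full JSON contains one entry per harness task).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8803418803418803},
    "harness|hendrycksTest-moral_scenarios|5": {"acc_norm": 0.3318435754189944},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.87},
}

# Keep only the hendrycksTest subtasks and map short names to acc_norm.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: scores["acc_norm"]
    for name, scores in results.items()
    if "hendrycksTest" in name
}
best = max(mmlu, key=mmlu.get)
worst = min(mmlu, key=mmlu.get)
print(best, worst)  # -> marketing abstract_algebra
```

The same pattern applies to the full results file once loaded with `json.load`.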
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DmitrMakeev/train_dreambooth_lora_sdxl | ---
license: openrail
---
|
GHOFRANEE/LLM_DATASET | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 548281
num_examples: 60
- name: validation
num_bytes: 198365
num_examples: 20
download_size: 318586
dataset_size: 746646
---
# Dataset Card for "LLM_DATASET"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_wrong_rare_v5_full_recite_ans_sent_random_permute_rerun_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 5309054.497520661
num_examples: 3365
- name: validation
num_bytes: 409972
num_examples: 300
download_size: 1473090
dataset_size: 5719026.497520661
---
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_ans_sent_random_permute_rerun_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-xsum-8dc1621c-12925732 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: sshleifer/distilbart-cnn-12-6
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sshleifer/distilbart-cnn-12-6
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |
Aoschu/donut_model_data_for_german_invoice | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 12829172.0
num_examples: 97
- name: validation
num_bytes: 2062396.0
num_examples: 14
- name: test
num_bytes: 2719786.0
num_examples: 18
download_size: 13266362
dataset_size: 17611354.0
---
# Dataset Card for "donut_model_data_for_german_invoice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Iftitahu/javanese_instruct_stories | ---
license: cc-by-4.0
task_categories:
- translation
- text-generation
- text2text-generation
language:
- id
- jv
- su
- en
size_categories:
- n<1K
---
A dataset of parallel translation-based instructions with Javanese as the target language.

Materials are taken from randomly selected children's stories at https://storyweaver.org.in, under the CC-BY-SA-4.0 license.

The template IDs are:

1. 'Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Inggris dadi teks crito ing Basa Jawa:', 'Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:'
2. 'Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Indonesia dadi teks crito ing Basa Jawa:', 'Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:'
3. 'Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Sunda dadi teks crito ing Basa Jawa:', 'Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:'

The data is composed of three parallel language subsets, each providing prompt inputs and target completions:

1. **en_javanese**
   - Prompt/instruction language: Javanese
   - Source/input language: English
   - Target/output language: Javanese
   - Size: 402 samples
   - Prompt template:
     - inputs: `Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Inggris dadi teks crito ing Basa Jawa:\n\n{input}\n\n`
     - targets: `Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:\n\n{output}`
2. **id_javanese**
   - Prompt/instruction language: Javanese
   - Source/input language: Indonesian
   - Target/output language: Javanese
   - Size: 407 samples
   - Prompt template:
     - inputs: `Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Indonesia dadi teks crito ing Basa Jawa:\n\n{input}\n\n`
     - targets: `Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:\n\n{output}`
3. **sunda_javanese**
   - Prompt/instruction language: Javanese
   - Source/input language: Sundanese
   - Target/output language: Javanese
   - Size: 20 samples
   - Prompt template:
     - inputs: `Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Sunda dadi teks crito ing Basa Jawa:\n\n{input}\n\n`
     - targets: `Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:\n\n{output}`
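As an illustration of how the templates above compose a full training sample, a minimal sketch (the example sentence pair is invented for demonstration; template ID 1, English to Javanese, copied verbatim from the card):

```python
# Template ID 1: English -> Javanese (copied from the card above).
INPUT_TEMPLATE = (
    "Terjemahno penggalan teks crito ing ngisor iki saka Bahasa Inggris "
    "dadi teks crito ing Basa Jawa:\n\n{input}\n\n"
)
TARGET_TEMPLATE = (
    "Terjemahane utawa padanan teks crito kasebut ing Basa Jawa yaiku:\n\n{output}"
)

def build_sample(source_text: str, target_text: str) -> dict:
    """Render one instruction-tuning sample from a parallel sentence pair."""
    return {
        "inputs": INPUT_TEMPLATE.format(input=source_text),
        "targets": TARGET_TEMPLATE.format(output=target_text),
    }

sample = build_sample("Once upon a time...", "Ing sawijining dina...")
print(sample["inputs"])
```

The other two subsets follow the same pattern with `Bahasa Indonesia` or `Bahasa Sunda` substituted in the instruction line.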
Data was originally prepared for enriching multilingual resources in Open Science AYA Project (2023). |
AyoubChLin/test_20News_AgNews_CnnNews | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 41896639
num_examples: 19850
download_size: 24928143
dataset_size: 41896639
---
|
Sowmya15/profanity_2 | ---
license: apache-2.0
---
|
KhalfounMehdi/cours_medecine | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4918302
num_examples: 313
download_size: 2424246
dataset_size: 4918302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cours_medecine"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/openhermes-dev__kaist-ai_prometheus-13b-v1.0__1707405480 | ---
dataset_info:
features:
- name: model
dtype: 'null'
- name: category
dtype: string
- name: language
dtype: string
- name: custom_instruction
dtype: bool
- name: id
dtype: string
- name: topic
dtype: string
- name: avatarUrl
dtype: 'null'
- name: idx
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: system_prompt
dtype: string
- name: source
dtype: string
- name: model_name
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: string
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: prompt
dtype: string
- name: token_length
dtype: int64
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate1_policy
dtype: string
- name: llm_as_a_judge_prompt
dtype: string
- name: completion
dtype: string
- name: candidate0_score
dtype: float64
- name: candidate1_score
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 3080868
num_examples: 167
download_size: 1688047
dataset_size: 3080868
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
---
|
SilpaCS/Alzheimer | ---
task_categories:
- image-classification
language:
- en
size_categories:
- 1K<n<10K
--- |
Falah/architecture_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 313206
num_examples: 1000
download_size: 42117
dataset_size: 313206
---
# Dataset Card for "architecture_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adalib/beatnum-cond-gen | ---
dataset_info:
features:
- name: code
dtype: string
splits:
- name: train
num_bytes: 5947006815
num_examples: 600969
download_size: 2174125889
dataset_size: 5947006815
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/zaizen_tokiko_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zaizen_tokiko/財前時子 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of zaizen_tokiko/財前時子 (THE iDOLM@STER: Cinderella Girls), containing 197 images and their tags.
The core tags of this character are `long_hair, brown_eyes, brown_hair, breasts, red_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 197 | 201.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zaizen_tokiko_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 197 | 139.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zaizen_tokiko_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 395 | 248.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zaizen_tokiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 197 | 185.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zaizen_tokiko_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 395 | 315.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zaizen_tokiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zaizen_tokiko_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
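If you only need the IMG+TXT packages and prefer to skip waifuc, the extracted archive can also be read with the standard library alone. A sketch under the assumption (not stated explicitly in the card) that each image sits next to a same-stem `.txt` file holding its comma-separated tags:

```python
from pathlib import Path

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def load_img_txt_pairs(dataset_dir: str) -> list[tuple[Path, list[str]]]:
    """Pair every image under dataset_dir with the tag list from its sibling .txt file."""
    pairs = []
    for image_path in sorted(Path(dataset_dir).rglob("*")):
        if image_path.suffix.lower() not in IMAGE_EXTS:
            continue
        tag_path = image_path.with_suffix(".txt")
        if not tag_path.exists():
            continue  # skip images without a tag file
        tags = [
            t.strip()
            for t in tag_path.read_text(encoding="utf-8").split(",")
            if t.strip()
        ]
        pairs.append((image_path, tags))
    return pairs
```

This yields `(image_path, tags)` tuples suitable for feeding a training pipeline directly.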
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, large_breasts, simple_background, white_background, earrings, handbag, necklace, skirt, smile, yellow_eyes, bracelet, dated, dress, holding, open_mouth, shirt, signature, upper_body |
| 1 | 11 |  |  |  |  |  | 1girl, hair_ornament, solo, cleavage, navel, thighhighs, whip, garter_straps, black_gloves, flower, looking_at_viewer, smile, bare_shoulders, earrings, elbow_gloves, open_mouth |
| 2 | 7 |  |  |  |  |  | 1girl, solo, smile, card_(medium), character_name, sun_symbol, necklace, orange_background, skirt, sparkle |
| 3 | 5 |  |  |  |  |  | 1girl, necklace, sitting, barefoot, holding_phone, smartphone, soles, toes, 1boy, cleavage, femdom, foot_focus, foreshortening, hetero, penis, shoes_removed, toenail_polish, clothed_female_nude_male, english_text, erection, footjob, handjob, high_heels, indoors, jacket, medium_breasts, on_back, on_bed, parted_lips, solo, sweat, uncensored |
| 4 | 9 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blue_serafuku, pleated_skirt, blue_skirt, full_body, hair_bow, open_mouth, shoes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | large_breasts | simple_background | white_background | earrings | handbag | necklace | skirt | smile | yellow_eyes | bracelet | dated | dress | holding | open_mouth | shirt | signature | upper_body | hair_ornament | cleavage | navel | thighhighs | whip | garter_straps | black_gloves | flower | bare_shoulders | elbow_gloves | card_(medium) | character_name | sun_symbol | orange_background | sparkle | sitting | barefoot | holding_phone | smartphone | soles | toes | 1boy | femdom | foot_focus | foreshortening | hetero | penis | shoes_removed | toenail_polish | clothed_female_nude_male | english_text | erection | footjob | handjob | high_heels | indoors | jacket | medium_breasts | on_back | on_bed | parted_lips | sweat | uncensored | blue_serafuku | pleated_skirt | blue_skirt | full_body | hair_bow | shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------------|:--------------------|:-------------------|:-----------|:----------|:-----------|:--------|:--------|:--------------|:-----------|:--------|:--------|:----------|:-------------|:--------|:------------|:-------------|:----------------|:-----------|:--------|:-------------|:-------|:----------------|:---------------|:---------|:-----------------|:---------------|:----------------|:-----------------|:-------------|:--------------------|:----------|:----------|:-----------|:----------------|:-------------|:--------|:-------|:-------|:---------|:-------------|:-----------------|:---------|:--------|:----------------|:-----------------|:---------------------------|:---------------|:-----------|:----------|:----------|:-------------|:----------|:---------|:-----------------|:----------|:---------|:--------------|:--------|:-------------|:----------------|:----------------|:-------------|:------------|:-----------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | | | X | | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|