| datasetId | card |
|---|---|
newsmediabias/DB4Good | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
- question-answering
- text-generation
- text2text-generation
language:
- en
pretty_name: DB4Good
size_categories:
- 10K<n<100K
configs:
- config_name: 1-Bias-Classification
data_files:
- split: classification
path: "1-Bias-Classification/train.csv"
- split: multi_label_classification
path: "1-Bias-Classification/multi-label classification.csv"
- split: train
path: "1-Bias-Classification/classification.csv"
- config_name: 2-Bias-Categorization
data_files:
- split: bias_aspects
path: "2-Bias-Categorization/aspects.csv"
- config_name: 3-Bias-Extraction
data_files:
- split: bias_tokens
path: "3-Bias-Extraction/Bias_tokens.csv"
- split: bias_tokens_in_CONLL
path: "3-Bias-Extraction/conll.csv"
- config_name: 4-Bias-Targetted-Demographics
data_files:
- split: demographics_data
path: "4-Bias-Targetted-Demographics/demo-train.csv"
- split: demographics_test
path: "4-Bias-Targetted-Demographics/demographics.csv"
- config_name: 5-Fairness-Evaluation
data_files:
- split: bias_detection_counterfactuals
path: "5-Fairness-Evaluation/Bias-Detection-Counterfactuals.csv"
- config_name: 6-Stereotypes
data_files:
- split: stereotype_prompts
path: "6-Stereotypes/stereotype_prompts.csv"
- config_name: 7-Benign-generation
data_files:
- split: Benign_texts
path: "7-Benign-Generation/bias-debias.csv"
---
# Dataset Card for DB4Good
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
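Each config and split declared in the YAML front matter above can be loaded individually with the `datasets` library, e.g. `load_dataset("newsmediabias/DB4Good", "1-Bias-Classification", split="classification")`. A minimal sketch of that mapping (config and split names are copied verbatim from the front matter; the `load_call` helper is illustrative, not part of the dataset):

```python
# Config -> split names, as declared in the YAML front matter of this card.
CONFIG_SPLITS = {
    "1-Bias-Classification": ("classification", "multi_label_classification", "train"),
    "2-Bias-Categorization": ("bias_aspects",),
    "3-Bias-Extraction": ("bias_tokens", "bias_tokens_in_CONLL"),
    "4-Bias-Targetted-Demographics": ("demographics_data", "demographics_test"),
    "5-Fairness-Evaluation": ("bias_detection_counterfactuals",),
    "6-Stereotypes": ("stereotype_prompts",),
    "7-Benign-generation": ("Benign_texts",),
}

def load_call(config: str, split: str) -> str:
    """Return the datasets.load_dataset() invocation for a config/split pair."""
    if split not in CONFIG_SPLITS.get(config, ()):
        raise KeyError(f"{split!r} is not a split of {config!r}")
    return f'load_dataset("newsmediabias/DB4Good", "{config}", split="{split}")'

# Example: the call for the token-level bias-extraction split.
print(load_call("3-Bias-Extraction", "bias_tokens"))
# -> load_dataset("newsmediabias/DB4Good", "3-Bias-Extraction", split="bias_tokens")
```

Actually loading a split requires `pip install datasets` and network access to the Hub.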
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_abideen__MonarchCoder-7B | ---
pretty_name: Evaluation run of abideen/MonarchCoder-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abideen/MonarchCoder-7B](https://huggingface.co/abideen/MonarchCoder-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__MonarchCoder-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T21:08:48.555243](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__MonarchCoder-7B/blob/main/results_2024-02-22T21-08-48.555243.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't
\ cover the same tasks. You can find each in the results and the \"latest\" split for
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6498100799858585,\n\
\ \"acc_stderr\": 0.03207331515564925,\n \"acc_norm\": 0.6509932629551645,\n\
\ \"acc_norm_stderr\": 0.03271930542505799,\n \"mc1\": 0.4602203182374541,\n\
\ \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6120799821862185,\n\
\ \"mc2_stderr\": 0.015360664269682777\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.013983036904094092,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6898028281218881,\n\
\ \"acc_stderr\": 0.004616288245259755,\n \"acc_norm\": 0.8730332603067118,\n\
\ \"acc_norm_stderr\": 0.0033225528296089053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424648,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424648\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931038,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931038\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867454,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867454\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.01612554382355295,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.01612554382355295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n\
\ \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6120799821862185,\n\
\ \"mc2_stderr\": 0.015360664269682777\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487059\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6512509476876421,\n \
\ \"acc_stderr\": 0.013127227055035861\n }\n}\n```"
repo_url: https://huggingface.co/abideen/MonarchCoder-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|arc:challenge|25_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|gsm8k|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hellaswag|10_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T21-08-48.555243.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T21-08-48.555243.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- '**/details_harness|winogrande|5_2024-02-22T21-08-48.555243.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T21-08-48.555243.parquet'
- config_name: results
data_files:
- split: 2024_02_22T21_08_48.555243
path:
- results_2024-02-22T21-08-48.555243.parquet
- split: latest
path:
- results_2024-02-22T21-08-48.555243.parquet
---
# Dataset Card for Evaluation run of abideen/MonarchCoder-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/MonarchCoder-7B](https://huggingface.co/abideen/MonarchCoder-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__MonarchCoder-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-22T21:08:48.555243](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__MonarchCoder-7B/blob/main/results_2024-02-22T21-08-48.555243.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each eval's results can be found in its config's "latest" split):
```json
{
"all": {
"acc": 0.6498100799858585,
"acc_stderr": 0.03207331515564925,
"acc_norm": 0.6509932629551645,
"acc_norm_stderr": 0.03271930542505799,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6120799821862185,
"mc2_stderr": 0.015360664269682777
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094092,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.013572657703084948
},
"harness|hellaswag|10": {
"acc": 0.6898028281218881,
"acc_stderr": 0.004616288245259755,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.0033225528296089053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424648,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424648
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931038,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931038
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867454,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867454
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355295,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6120799821862185,
"mc2_stderr": 0.015360664269682777
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487059
},
"harness|gsm8k|5": {
"acc": 0.6512509476876421,
"acc_stderr": 0.013127227055035861
}
}
```
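The per-task scores above can be aggregated locally. The sketch below is illustrative only (it is not part of the evaluation harness): it copies a small excerpt of the JSON by hand so it runs offline, then averages the normalized accuracy over the MMLU (`hendrycksTest`) subtasks in that excerpt. With the full `results` config loaded, the same filter-and-average pattern applies to all 57 subtasks.

```python
# Excerpt of the results JSON above, keyed by "harness|<task>|<n_shots>".
# Only acc_norm is kept here, since that is what the leaderboard averages.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6907894736842105},
}

# Select the MMLU subtasks and average their normalized accuracy.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(v["acc_norm"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU acc_norm over {len(mmlu_tasks)} subtasks: {mmlu_avg:.4f}")
```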
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
a6kme/minds14-mirror | ---
annotations_creators:
- expert-generated
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
- fr
- it
- es
- pt
- de
- nl
- ru
- pl
- cs
- ko
- zh
language_bcp47:
- en
- en-GB
- en-US
- en-AU
- fr
- it
- es
- pt
- de
- nl
- ru
- pl
- cs
- ko
- zh
license:
- cc-by-4.0
multilinguality:
- multilingual
pretty_name: 'MInDS-14'
size_categories:
- 10K<n<100K
task_categories:
- automatic-speech-recognition
- speech-processing
task_ids:
- speech-recognition
- keyword-spotting
---
# MInDS-14
## Dataset Description
- **Fine-Tuning script:** [pytorch/audio-classification](https://github.com/huggingface/transformers/tree/main/examples/pytorch/audio-classification)
- **Paper:** [Multilingual and Cross-Lingual Intent Detection from Spoken Data](https://arxiv.org/abs/2104.08524)
- **Total amount of disk used:** ca. 500 MB
MInDS-14 is a training and evaluation resource for the intent detection task with spoken data. It covers 14
intents extracted from a commercial system in the e-banking domain, associated with spoken examples in 14 diverse language varieties.
## Example
MInDS-14 can be downloaded and used as follows:
```py
from datasets import load_dataset
minds_14 = load_dataset("PolyAI/minds14", "fr-FR")  # for French
# to download all languages for multilingual fine-tuning, uncomment the following line
# minds_14 = load_dataset("PolyAI/minds14", "all")
# see structure
print(minds_14)
# load audio sample on the fly
audio_input = minds_14["train"][0]["audio"]  # first decoded audio sample
intent_class = minds_14["train"][0]["intent_class"]  # first intent class id
intent = minds_14["train"].features["intent_class"].names[intent_class]
# use audio_input and intent_class to fine-tune your model for audio classification
```
## Dataset Structure
We show detailed information for the example configuration `fr-FR` of the dataset.
All other configurations have the same structure.
### Data Instances
**fr-FR**
- Size of downloaded dataset files: 471 MB
- Size of the generated dataset: 300 KB
- Total amount of disk used: 471 MB
An example of a data instance of the config `fr-FR` looks as follows:
```
{
"path": "/home/patrick/.cache/huggingface/datasets/downloads/extracted/3ebe2265b2f102203be5e64fa8e533e0c6742e72268772c8ac1834c5a1a921e3/fr-FR~ADDRESS/response_4.wav",
"audio": {
"path": "/home/patrick/.cache/huggingface/datasets/downloads/extracted/3ebe2265b2f102203be5e64fa8e533e0c6742e72268772c8ac1834c5a1a921e3/fr-FR~ADDRESS/response_4.wav",
"array": array(
[0.0, 0.0, 0.0, ..., 0.0, 0.00048828, -0.00024414], dtype=float32
),
"sampling_rate": 8000,
},
"transcription": "je souhaite changer mon adresse",
"english_transcription": "I want to change my address",
"intent_class": 1,
"lang_id": 6,
}
```
### Data Fields
The data fields are the same among all splits.
- **path** (str): Path to the audio file
- **audio** (dict): Audio object including the loaded audio array, sampling rate and path to the audio file
- **transcription** (str): Transcription of the audio file
- **english_transcription** (str): English transcription of the audio file
- **intent_class** (int): Class id of intent
- **lang_id** (int): Id of language
### Data Splits
Every config only has the `"train"` split, containing *ca.* 600 examples.
## Dataset Creation
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
All datasets are licensed under the [Creative Commons Attribution 4.0 International license (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
```
@article{DBLP:journals/corr/abs-2104-08524,
author = {Daniela Gerz and
Pei{-}Hao Su and
Razvan Kusztos and
Avishek Mondal and
Michal Lis and
Eshan Singhal and
Nikola Mrksic and
Tsung{-}Hsien Wen and
Ivan Vulic},
title = {Multilingual and Cross-Lingual Intent Detection from Spoken Data},
journal = {CoRR},
volume = {abs/2104.08524},
year = {2021},
url = {https://arxiv.org/abs/2104.08524},
eprinttype = {arXiv},
eprint = {2104.08524},
timestamp = {Mon, 26 Apr 2021 17:25:10 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2104-08524.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset.
|
CJWeiss/LexGenZero_eurlexsum | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: input
dtype: string
- name: output
dtype: string
- name: fk_grade
dtype: float64
- name: cluster
dtype: string
- name: old_id
dtype: int64
splits:
- name: train
num_bytes: 21274111
num_examples: 50
download_size: 7856696
dataset_size: 21274111
---
# Dataset Card for "LexGenZero_eurlexsum"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_prince-canuma__Damysus-Coder-v0.1 | ---
pretty_name: Evaluation run of prince-canuma/Damysus-Coder-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [prince-canuma/Damysus-Coder-v0.1](https://huggingface.co/prince-canuma/Damysus-Coder-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_prince-canuma__Damysus-Coder-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T21:07:57.637768](https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-Coder-v0.1/blob/main/results_2024-04-15T21-07-57.637768.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6043961155404135,\n\
\ \"acc_stderr\": 0.033186632585662346,\n \"acc_norm\": 0.6094482265591639,\n\
\ \"acc_norm_stderr\": 0.033859874632835685,\n \"mc1\": 0.4724602203182375,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6419919334749323,\n\
\ \"mc2_stderr\": 0.01518622081933932\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.01452122640562708,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513782\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.64070902210715,\n \
\ \"acc_stderr\": 0.004788120727316245,\n \"acc_norm\": 0.840071698864768,\n\
\ \"acc_norm_stderr\": 0.0036579044379436557\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335842,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335842\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647886,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647886\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.015839400406212494,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.015839400406212494\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n\
\ \"acc_stderr\": 0.012630884771599698,\n \"acc_norm\": 0.42633637548891784,\n\
\ \"acc_norm_stderr\": 0.012630884771599698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354025,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6419919334749323,\n\
\ \"mc2_stderr\": 0.01518622081933932\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025395\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39272175890826383,\n \
\ \"acc_stderr\": 0.01345174534958657\n }\n}\n```"
repo_url: https://huggingface.co/prince-canuma/Damysus-Coder-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-07-57.637768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-07-57.637768.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- '**/details_harness|winogrande|5_2024-04-15T21-07-57.637768.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T21-07-57.637768.parquet'
- config_name: results
data_files:
- split: 2024_04_15T21_07_57.637768
path:
- results_2024-04-15T21-07-57.637768.parquet
- split: latest
path:
- results_2024-04-15T21-07-57.637768.parquet
---
# Dataset Card for Evaluation run of prince-canuma/Damysus-Coder-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [prince-canuma/Damysus-Coder-v0.1](https://huggingface.co/prince-canuma/Damysus-Coder-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_prince-canuma__Damysus-Coder-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T21:07:57.637768](https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-Coder-v0.1/blob/main/results_2024-04-15T21-07-57.637768.json) (note that there may be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the timestamped splits and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6043961155404135,
"acc_stderr": 0.033186632585662346,
"acc_norm": 0.6094482265591639,
"acc_norm_stderr": 0.033859874632835685,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6419919334749323,
"mc2_stderr": 0.01518622081933932
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.01452122640562708,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.014258563880513782
},
"harness|hellaswag|10": {
"acc": 0.64070902210715,
"acc_stderr": 0.004788120727316245,
"acc_norm": 0.840071698864768,
"acc_norm_stderr": 0.0036579044379436557
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686858,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686858
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335842,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335842
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647886,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212494,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212494
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599698,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354025,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6419919334749323,
"mc2_stderr": 0.01518622081933932
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025395
},
"harness|gsm8k|5": {
"acc": 0.39272175890826383,
"acc_stderr": 0.01345174534958657
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
julianrisch/qa-dataset-original-21020 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 18661234
num_examples: 21020
download_size: 11708980
dataset_size: 18661234
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qa-dataset-original-21020"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
notoriousdto/synthetic-elisp-alpha-0.1 | ---
license: mit
---
This dataset is a work in progress. It will be used to train an LLM to execute a subset of Emacs Lisp, following the techniques described in this paper: https://arxiv.org/abs/2305.05383 |
dim/habr_10k | ---
dataset_info:
features:
- name: id
dtype: uint32
- name: language
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text_markdown
dtype: string
- name: text_html
dtype: string
- name: author
dtype: string
- name: original_author
dtype: string
- name: original_url
dtype: string
- name: lead_html
dtype: string
- name: lead_markdown
dtype: string
- name: type
dtype: string
- name: time_published
dtype: uint64
- name: statistics
struct:
- name: commentsCount
dtype: uint32
- name: favoritesCount
dtype: uint32
- name: readingCount
dtype: uint32
- name: score
dtype: int32
- name: votesCount
dtype: int32
- name: votesCountPlus
dtype: int32
- name: votesCountMinus
dtype: int32
- name: labels
sequence: string
- name: hubs
sequence: string
- name: flows
sequence: string
- name: tags
sequence: string
- name: reading_time
dtype: uint32
- name: format
dtype: string
- name: complexity
dtype: string
- name: comments
sequence:
- name: id
dtype: uint64
- name: parent_id
dtype: uint64
- name: level
dtype: uint32
- name: time_published
dtype: uint64
- name: score
dtype: int32
- name: votes
dtype: uint32
- name: message_html
dtype: string
- name: message_markdown
dtype: string
- name: author
dtype: string
- name: children
sequence: uint64
- name: readingCount
dtype: int64
splits:
- name: train
num_bytes: 661170132.0315578
num_examples: 10000
download_size: 901387901
dataset_size: 661170132.0315578
---
# Dataset Card for "habr_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cheafdevo56/HighlyInfluentialTriplets | ---
license: apache-2.0
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 73286160.22355364
num_examples: 19227
- name: validation
num_bytes: 8145447.776446358
num_examples: 2137
download_size: 48594525
dataset_size: 81431608.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_93 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 101753230
num_examples: 10399
download_size: 29719895
dataset_size: 101753230
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_93"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dolo650/alpaca_10000 | ---
license: apache-2.0
---
|
Pawamami/Pm | ---
license: apache-2.0
task_categories:
- text-generation
language:
- fr
- en
- ha
- ar
tags:
- music
pretty_name: PawaGTPs
size_categories:
- 10M<n<100M
--- |
yiyic/beir | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: arguana
num_bytes: 2786456
num_examples: 2807
- name: climate_fever
num_bytes: 2516703
num_examples: 2877
- name: dbpedia_entity
num_bytes: 13982112
num_examples: 41124
- name: fiqa
num_bytes: 1824949
num_examples: 2353
- name: msmarco
num_bytes: 3153901
num_examples: 9182
- name: nfcorpus
num_bytes: 4689125
num_examples: 3451
- name: nq
num_bytes: 2727274
num_examples: 7653
- name: quora
num_bytes: 1442109
num_examples: 25675
- name: scidocs
num_bytes: 29269039
num_examples: 26313
- name: scifact
num_bytes: 458045
num_examples: 583
- name: trec_covid
num_bytes: 42655975
num_examples: 30012
- name: webis_touche2020
num_bytes: 5610372
num_examples: 2148
download_size: 65542954
dataset_size: 111116060
configs:
- config_name: default
data_files:
- split: arguana
path: data/arguana-*
- split: climate_fever
path: data/climate_fever-*
- split: dbpedia_entity
path: data/dbpedia_entity-*
- split: fiqa
path: data/fiqa-*
- split: msmarco
path: data/msmarco-*
- split: nfcorpus
path: data/nfcorpus-*
- split: nq
path: data/nq-*
- split: quora
path: data/quora-*
- split: scidocs
path: data/scidocs-*
- split: scifact
path: data/scifact-*
- split: trec_covid
path: data/trec_covid-*
- split: webis_touche2020
path: data/webis_touche2020-*
---
|
armvectores/hyw_wikipedia_2023 | ---
task_categories:
- text-generation
language:
- hyw
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 55910963
num_examples: 10785
download_size: 26613923
dataset_size: 55910963
tags:
- wikipedia
- western armenian
size_categories:
- 1M<n<10M
---
Western Armenian Wikipedia, April 2023
4M tokens
10,785 articles |
edbeeching/prj_gia_dataset_mujoco_walker_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the mujoco_walker environment, containing samples from the policy mujoco_walker_1111.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
|
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 141394
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 141394
num_examples: 1000
download_size: 106130
dataset_size: 282788
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712977874 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9470
num_examples: 21
download_size: 9500
dataset_size: 9470
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712977874"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
k0ntra/tonymontana | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
splits:
- name: train
num_bytes: 1536
num_examples: 1
download_size: 161246
dataset_size: 1536
---
# Dataset Card for "tonymontana"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shengtao/recipe | ---
license: mit
---
|
odepraz/rvl_cdip_1percentofdata | ---
license: unknown
---
|
asapp/slue-phase-2 | ---
dataset_info:
- config_name: hvb
features:
- name: issue_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speaker_id
dtype: string
- name: text
dtype: string
- name: utt_index
dtype: int32
- name: channel
dtype: int32
- name: role
dtype: string
- name: start_ms
dtype: int32
- name: duration_ms
dtype: int32
- name: intent
dtype: string
- name: dialog_acts
sequence: string
splits:
- name: train
num_bytes: 803631533.648
num_examples: 11344
- name: validation
num_bytes: 115999281.63
num_examples: 1690
- name: test
num_bytes: 413280185.739
num_examples: 6121
download_size: 1287263357
dataset_size: 1332911001.017
- config_name: sqa5
features:
- name: question_id
dtype: string
- name: question_audio
dtype:
audio:
sampling_rate: 16000
- name: question_speaker_id
dtype: string
- name: raw_question_text
dtype: string
- name: normalized_question_text
dtype: string
- name: document_id
dtype: string
- name: document_audio
dtype:
audio:
sampling_rate: 16000
- name: document_speaker_id
dtype: string
- name: raw_document_text
dtype: string
- name: normalized_document_text
dtype: string
- name: word2time
sequence:
- name: word
dtype: string
- name: normalized_word
dtype: string
- name: start_second
dtype: float64
- name: end_second
dtype: float64
- name: answer_spans
sequence:
- name: answer
dtype: string
- name: start_second
dtype: float64
- name: end_second
dtype: float64
splits:
- name: train
num_bytes: 134775904845.04
num_examples: 46186
- name: validation
num_bytes: 5686714785.843
num_examples: 1939
- name: test
num_bytes: 6967375359.628
num_examples: 2382
- name: verified_test
num_bytes: 1182628989.0
num_examples: 408
download_size: 118074473123
dataset_size: 148612623979.511
- config_name: ted
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speaker
dtype: string
- name: transcript
dtype: string
- name: title
dtype: string
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 46573026086.984
num_examples: 3384
- name: validation
num_bytes: 5694199931.0
num_examples: 425
- name: test
num_bytes: 5959094411.0
num_examples: 423
download_size: 58384489268
dataset_size: 58226320428.984
- config_name: vp_nel
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speaker_id
dtype: string
- name: text
dtype: string
- name: word_timestamps
sequence:
- name: word
dtype: string
- name: start_sec
dtype: float64
- name: end_sec
dtype: float64
- name: ne_timestamps
sequence:
- name: ne_label
dtype: string
- name: start_char_idx
dtype: int32
- name: char_offset
dtype: int32
- name: start_sec
dtype: float64
- name: end_sec
dtype: float64
splits:
- name: validation
num_bytes: 83371882.75
num_examples: 1750
- name: test
num_bytes: 85222143.142
num_examples: 1838
download_size: 165119242
dataset_size: 168594025.89200002
configs:
- config_name: hvb
data_files:
- split: train
path: hvb/train-*
- split: validation
path: hvb/validation-*
- split: test
path: hvb/test-*
- config_name: sqa5
data_files:
- split: train
path: sqa5/train-*
- split: validation
path: sqa5/validation-*
- split: test
path: sqa5/test-*
- split: verified_test
path: sqa5/verified_test-*
- config_name: ted
data_files:
- split: train
path: ted/train-*
- split: validation
path: ted/validation-*
- split: test
path: ted/test-*
- config_name: vp_nel
data_files:
- split: validation
path: vp_nel/validation-*
- split: test
path: vp_nel/test-*
---
### Dataset description
- **(Jan. 8 2024) Test set labels are released**
- **Toolkit Repository:** [https://github.com/asappresearch/slue-toolkit/](https://github.com/asappresearch/slue-toolkit/)
- **Paper:** [https://arxiv.org/abs/2212.10525](https://arxiv.org/abs/2212.10525)
### Licensing Information
#### SLUE-HVB
The SLUE-HVB dataset contains a subset of the Gridspace-Stanford Harper Valley speech dataset; this subset remains under the original license, CC-BY-4.0. See also the original license notice (https://github.com/cricketclub/gridspace-stanford-harper-valley/blob/master/LICENSE).
Additionally, we provide dialog act classification annotations, which are covered by the same CC-BY-4.0 license.
#### SLUE-SQA-5
The SLUE-SQA-5 dataset contains question texts and answer strings (the question_text, normalized_question_text, and answer_spans columns in the .tsv files) from these datasets:
* SQuAD1.1 (for questions whose question_id starts with ‘squad-’)
* Natural Questions (for questions whose question_id starts with ‘nq-’)
* WebQuestions (for questions whose question_id starts with ‘wq-’)
* CuratedTREC (for questions whose question_id starts with ‘trec-’)
* TriviaQA (for questions whose question_id starts with ‘triviaqa-’)
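Because licensing differs by source corpus, it can be useful to route a question back to its origin via these question_id prefixes. A minimal sketch (the prefix table is taken from the list above; the helper name is illustrative):

```python
# Map question_id prefixes (listed above) to their source QA datasets.
PREFIX_TO_SOURCE = {
    "squad-": "SQuAD1.1",
    "nq-": "Natural Questions",
    "wq-": "WebQuestions",
    "trec-": "CuratedTREC",
    "triviaqa-": "TriviaQA",
}

def question_source(question_id: str) -> str:
    """Return the source dataset for a SLUE-SQA-5 question_id."""
    for prefix, source in PREFIX_TO_SOURCE.items():
        if question_id.startswith(prefix):
            return source
    return "unknown"
```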
Additionally, we provide audio recordings (.wav files in “question” directories) of these questions.
For questions from TriviaQA (questions whose question_id starts with ‘triviaqa-’), their question texts, answer strings, and audio recordings are licensed with the same Apache License 2.0 as TriviaQA (for more detail, please refer to https://github.com/mandarjoshi90/triviaqa/blob/master/LICENSE).
For questions from the other 4 datasets, their question texts, answer strings, and audio recordings are licensed with Creative Commons Attribution-ShareAlike 4.0 International license.
SLUE-SQA-5 also contains a subset of Spoken Wikipedia, including the audios placed in “document” directories and their transcripts (the document_text and normalized_document_text columns in the .tsv files). Additionally, we provide the text-to-speech alignments (.txt files in “word2time” directories). These contents are licensed with the same Creative Commons (CC BY-SA 4.0) license as Spoken Wikipedia.
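The word2time alignments make it straightforward to map a time span back to its transcript words. A minimal sketch, assuming each alignment entry carries the word, start_second, and end_second fields listed in the schema above (the helper name is illustrative):

```python
def words_in_span(word2time, start_sec, end_sec):
    """Return transcript words fully contained in [start_sec, end_sec]."""
    return [
        entry["word"]
        for entry in word2time
        if entry["start_second"] >= start_sec and entry["end_second"] <= end_sec
    ]
```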
#### SLUE-TED
The SLUE-TED dataset contains TED Talk audio along with the associated abstracts and titles, which were concatenated to create reference summaries. This corpus is licensed with the same Creative Commons (CC BY–NC–ND 4.0 International) license as TED talks. For further information, please refer to the details provided below.
=============================
TED.com
We encourage you to share TED Talks under our Creative Commons license (CC BY-NC-ND 4.0 International), which means they may be shared under the conditions below:
CC: means the type of license rights associated with TED Talks, or Creative Commons
BY: means the requirement to include an attribution to TED as the owner of the TED Talk and include a link to the talk, but do not include any other TED branding on your website or platform, or language that may imply an endorsement.
NC: means you cannot use TED Talks in any commercial context or to gain any type of revenue, payment or fee from the license, sublicense, access or usage of TED Talks in an app of any kind for any advertising, or in exchange for payment of any kind, including in any ad supported content or format.
ND: means that no derivative works are permitted, so you cannot edit, remix, create, modify or alter the form of the TED Talks in any way. This includes using the TED Talks as the basis for another work, including dubbing, voice-overs, or other translations not authorized by TED. You may not add any more restrictions than we have placed on the TED site content, such as additional legal or technological restrictions on accessing the content.
|
DrakuTheDragon/Wiki_de | ---
language:
- de
--- |
CoreloneH/coco | ---
license: mit
---
|
jcrisch/fuzi-characters | ---
language:
- en
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 55941193.0
num_examples: 251
download_size: 41251195
dataset_size: 55941193.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Dataset for Fuzi characters.
Color: blue. No combined characters.
karmiq/wikipedia-embeddings-cs-minilm | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: chunks
sequence: string
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 3302394852
num_examples: 534044
download_size: 3029969220
dataset_size: 3302394852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- cs
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- fill-mask
license:
- cc-by-sa-3.0
- gfdl
---
This dataset contains the Czech subset of the [`wikimedia/wikipedia`](https://huggingface.co/datasets/wikimedia/wikipedia) dataset. Each page is divided into paragraphs, stored as a list in the `chunks` column. For every paragraph, embeddings are created using the [`sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2`](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) model.
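How the `chunks` column was produced is not stated on the card; a minimal sketch, assuming paragraphs are separated by blank lines, could look like this (the function name is illustrative):

```python
def split_into_chunks(page_text: str) -> list:
    """Split a Wikipedia page into paragraph chunks (blank-line separated)."""
    return [p.strip() for p in page_text.split("\n\n") if p.strip()]

# Embedding the chunks then uses the standard sentence-transformers API, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer(
#       "sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")
#   embeddings = model.encode(split_into_chunks(page_text))
# This model produces 384-dimensional vectors, one per chunk.
```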
## Usage
Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("karmiq/wikipedia-embeddings-cs-minilm", split="train")
ds[1]
```
```
{
'id': '1',
'url': 'https://cs.wikipedia.org/wiki/Astronomie',
'title': 'Astronomie',
'chunks': [
'Astronomie, řecky αστρονομία z άστρον ( astron ) hvězda a νόμος ( nomos )...',
'Myšlenky Aristotelovy rozvinul ve 2. století našeho letopočtu Klaudios Ptolemaios...',
...,
],
'embeddings': [
[0.09006806463003159, -0.009814552962779999, ...],
[0.10767366737127304, ...],
...
]
}
```
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
```python
# Imports and client setup needed by this snippet:
from elasticsearch import Elasticsearch
from elasticsearch.helpers import parallel_bulk
from tqdm import tqdm

es = Elasticsearch("http://localhost:9200")  # point this at your cluster
def doc_generator(data, batch_size=1000):
for batch in data.with_format("numpy").iter(batch_size):
for i, id in enumerate(batch["id"]):
output = {"id": id}
output["title"] = batch["title"][i]
output["url"] = batch["url"][i]
output["parts"] = [
{ "chunk": chunk, "embedding": embedding }
for chunk, embedding in zip(batch["chunks"][i], batch["embeddings"][i])
]
yield output
num_indexed, num_failed = 0, 0
progress = tqdm(total=ds.num_rows, unit="doc", desc="Indexing")
for ok, info in parallel_bulk(
es,
index="wikipedia-search",
actions=doc_generator(ds),
raise_on_error=False,
):
if not ok:
print(f"ERROR {info['index']['status']}: "
f"{info['index']['error']['type']}: {info['index']['error']['caused_by']['type']}: "
f"{info['index']['error']['caused_by']['reason'][:250]}")
progress.update(1)
```
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
```python
import os
import textwrap

import sentence_transformers
model = sentence_transformers.SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")
ds.set_format(type="torch", columns=["embeddings"], output_all_columns=True)
# Flatten the dataset
def explode_sequence(batch):
output = { "id": [], "url": [], "title": [], "chunk": [], "embedding": [] }
for id, url, title, chunks, embeddings in zip(
batch["id"], batch["url"], batch["title"], batch["chunks"], batch["embeddings"]
):
output["id"].extend([id for _ in range(len(chunks))])
output["url"].extend([url for _ in range(len(chunks))])
output["title"].extend([title for _ in range(len(chunks))])
output["chunk"].extend(chunks)
output["embedding"].extend(embeddings)
return output
ds_flat = ds.map(
explode_sequence,
batched=True,
remove_columns=ds.column_names,
num_proc=min(os.cpu_count(), 32),
desc="Flatten")
ds_flat
query = "Čím se zabývá fyzika?"
hits = sentence_transformers.util.semantic_search(
query_embeddings=model.encode(query),
corpus_embeddings=ds_flat["embedding"],
top_k=10)
for hit in hits[0]:
title = ds_flat[hit['corpus_id']]['title']
chunk = ds_flat[hit['corpus_id']]['chunk']
print(f"[{hit['score']:0.2f}] {textwrap.shorten(chunk, width=100, placeholder='…')} [{title}]")
# [0.90] Fyzika částic ( též částicová fyzika ) je oblast fyziky, která se zabývá částicemi. V širším smyslu… [Fyzika částic]
# [0.89] Fyzika ( z řeckého φυσικός ( fysikos ): přírodní, ze základu φύσις ( fysis ): příroda, archaicky… [Fyzika]
# ...
```
</details>
The embeddings generation took about 15 minutes on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <https://huggingface.co/datasets/wikimedia/wikipedia>.
|
haris001/deepseek_dataset | ---
license: mit
---
|
jlbaker361/flickr_humans_0.5k_scream | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 226369823.0
num_examples: 500
download_size: 226373537
dataset_size: 226369823.0
---
# Dataset Card for "flickr_humans_0.5k_scream"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mutiyu20/yu_nagaba | ---
license: artistic-2.0
---
|
diffusers-parti-prompts/sdxl-1.0 | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Category
dtype: string
- name: Challenge
dtype: string
- name: Note
dtype: string
- name: images
dtype: image
- name: model_name
dtype: string
- name: seed
dtype: int64
splits:
- name: train
num_bytes: 189820015.232
num_examples: 1632
download_size: 189815139
dataset_size: 189820015.232
---
# Dataset Card for "sdxl-1.0"
Dataset was generated using the code below:
```python
import torch
from datasets import Dataset, Features
from datasets import Image as ImageFeature
from datasets import Value, load_dataset
from diffusers import DDIMScheduler, DiffusionPipeline
import PIL
def main():
print("Loading dataset...")
parti_prompts = load_dataset("nateraw/parti-prompts", split="train")
print("Loading pipeline...")
ckpt_id = "stabilityai/stable-diffusion-xl-base-1.0"
pipe = DiffusionPipeline.from_pretrained(
ckpt_id, torch_dtype=torch.float16, use_auth_token=True
).to("cuda")
pipe.scheduler = DDIMScheduler.from_config(pipe.scheduler.config)
pipe.set_progress_bar_config(disable=True)
seed = 0
generator = torch.Generator("cuda").manual_seed(seed)
print("Running inference...")
main_dict = {}
for i in range(len(parti_prompts)):
sample = parti_prompts[i]
prompt = sample["Prompt"]
image = pipe(
prompt,
generator=generator,
num_inference_steps=100,
guidance_scale=7.5,
).images[0]
image = image.resize((256, 256), resample=PIL.Image.Resampling.LANCZOS)
img_path = f"sd_xl_{i}.png"
image.save(img_path)
main_dict.update(
{
prompt: {
"img_path": img_path,
"Category": sample["Category"],
"Challenge": sample["Challenge"],
"Note": sample["Note"],
"model_name": ckpt_id,
"seed": seed,
}
}
)
def generation_fn():
for prompt in main_dict:
prompt_entry = main_dict[prompt]
yield {
"Prompt": prompt,
"Category": prompt_entry["Category"],
"Challenge": prompt_entry["Challenge"],
"Note": prompt_entry["Note"],
"images": {"path": prompt_entry["img_path"]},
"model_name": prompt_entry["model_name"],
"seed": prompt_entry["seed"],
}
print("Preparing HF dataset...")
ds = Dataset.from_generator(
generation_fn,
features=Features(
Prompt=Value("string"),
Category=Value("string"),
Challenge=Value("string"),
Note=Value("string"),
images=ImageFeature(),
model_name=Value("string"),
seed=Value("int64"),
),
)
ds_id = "diffusers-parti-prompts/sdxl-1.0"
ds.push_to_hub(ds_id)
if __name__ == "__main__":
    main()
```
|
SmartLabsData/Phone-Price-Prediction | ---
license: apache-2.0
---
|
anan-2024/twitter_dataset_1713107152 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 105388
num_examples: 279
download_size: 58083
dataset_size: 105388
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pmarmik/filtered_samvaad | ---
dataset_info:
features:
- name: messages
dtype: string
splits:
- name: train
num_bytes: 325833992.9441444
num_examples: 68000
- name: validation
num_bytes: 45520925.48484371
num_examples: 9500
- name: test
num_bytes: 23958381.834128268
num_examples: 5000
download_size: 166833215
dataset_size: 395313300.2631164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Technoculture__Medtulu-4x7B | ---
pretty_name: Evaluation run of Technoculture/Medtulu-4x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/Medtulu-4x7B](https://huggingface.co/Technoculture/Medtulu-4x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medtulu-4x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T09:26:06.099420](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-4x7B/blob/main/results_2024-01-16T09-26-06.099420.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2441106685479756,\n\
\ \"acc_stderr\": 0.030388013771384576,\n \"acc_norm\": 0.24501971068706568,\n\
\ \"acc_norm_stderr\": 0.031199333244496447,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080522,\n \"mc2\": 0.47911756406040795,\n\
\ \"mc2_stderr\": 0.016890966208763153\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21928327645051193,\n \"acc_stderr\": 0.012091245787615707,\n\
\ \"acc_norm\": 0.28754266211604096,\n \"acc_norm_stderr\": 0.01322671905626613\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2559251145190201,\n\
\ \"acc_stderr\": 0.004354881005789727,\n \"acc_norm\": 0.2574188408683529,\n\
\ \"acc_norm_stderr\": 0.004363185172047182\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610645,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610645\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.02560423347089911,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.02560423347089911\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.038009680605548574,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.038009680605548574\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.037082846624165444,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.037082846624165444\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n\
\ \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.02354079935872331,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.02354079935872331\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444437,\n\
\ \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444437\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.02184086699042308,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.02184086699042308\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.02564947026588919,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.02564947026588919\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007245,\n \"\
acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \
\ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n\
\ \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.22869955156950672,\n\
\ \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n\
\ \"acc_stderr\": 0.015517322365529615,\n \"acc_norm\": 0.2515964240102171,\n\
\ \"acc_norm_stderr\": 0.015517322365529615\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468645,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250885,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250885\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26010430247718386,\n\
\ \"acc_stderr\": 0.011204382887823834,\n \"acc_norm\": 0.26010430247718386,\n\
\ \"acc_norm_stderr\": 0.011204382887823834\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1746987951807229,\n\
\ \"acc_stderr\": 0.02956032621125685,\n \"acc_norm\": 0.1746987951807229,\n\
\ \"acc_norm_stderr\": 0.02956032621125685\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080522,\n \"mc2\": 0.47911756406040795,\n\
\ \"mc2_stderr\": 0.016890966208763153\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/Medtulu-4x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|arc:challenge|25_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|gsm8k|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hellaswag|10_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T09-26-06.099420.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- '**/details_harness|winogrande|5_2024-01-16T09-26-06.099420.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T09-26-06.099420.parquet'
- config_name: results
data_files:
- split: 2024_01_16T09_26_06.099420
path:
- results_2024-01-16T09-26-06.099420.parquet
- split: latest
path:
- results_2024-01-16T09-26-06.099420.parquet
---
# Dataset Card for Evaluation run of Technoculture/Medtulu-4x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medtulu-4x7B](https://huggingface.co/Technoculture/Medtulu-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medtulu-4x7B",
"harness_winogrande_5",
split="train")
```
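Each per-task entry in the results reports `acc`, `acc_stderr`, `acc_norm`, and `acc_norm_stderr`. As a minimal sketch of working with these metrics, the snippet below averages accuracies across a few tasks, using values taken from this run's results (the full results file covers all evaluated tasks):

```python
# Aggregate per-task accuracies; the values below are a small illustrative
# subset copied from this run's results JSON.
task_metrics = {
    "harness|arc:challenge|25": {"acc": 0.21928327645051193},
    "harness|hellaswag|10": {"acc": 0.2559251145190201},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
}

# Unweighted mean accuracy over the selected tasks.
mean_acc = sum(m["acc"] for m in task_metrics.values()) / len(task_metrics)
print(round(mean_acc, 4))  # → 0.2551
```

Note that the leaderboard's headline numbers are computed from the full "results" configuration, not from a subset like this.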
## Latest results
These are the [latest results from run 2024-01-16T09:26:06.099420](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-4x7B/blob/main/results_2024-01-16T09-26-06.099420.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2441106685479756,
"acc_stderr": 0.030388013771384576,
"acc_norm": 0.24501971068706568,
"acc_norm_stderr": 0.031199333244496447,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080522,
"mc2": 0.47911756406040795,
"mc2_stderr": 0.016890966208763153
},
"harness|arc:challenge|25": {
"acc": 0.21928327645051193,
"acc_stderr": 0.012091245787615707,
"acc_norm": 0.28754266211604096,
"acc_norm_stderr": 0.01322671905626613
},
"harness|hellaswag|10": {
"acc": 0.2559251145190201,
"acc_stderr": 0.004354881005789727,
"acc_norm": 0.2574188408683529,
"acc_norm_stderr": 0.004363185172047182
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610645,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610645
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.02560423347089911,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.02560423347089911
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.038009680605548574,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.038009680605548574
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.037082846624165444,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.037082846624165444
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412424,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.02354079935872331,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.02354079935872331
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444437,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444437
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.02184086699042308,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.02184086699042308
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.02564947026588919,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.02564947026588919
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007245,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03005820270430985,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03005820270430985
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.028188240046929196,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.028188240046929196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529615,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529615
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468645,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250885,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250885
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26010430247718386,
"acc_stderr": 0.011204382887823834,
"acc_norm": 0.26010430247718386,
"acc_norm_stderr": 0.011204382887823834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594726,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1746987951807229,
"acc_stderr": 0.02956032621125685,
"acc_norm": 0.1746987951807229,
"acc_norm_stderr": 0.02956032621125685
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080522,
"mc2": 0.47911756406040795,
"mc2_stderr": 0.016890966208763153
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076918
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
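As a sketch, per-task results like the JSON above can be aggregated in a few lines of Python. The inline `results` dict below is a small hypothetical excerpt (values copied from the block above), not the full file; only keys following the `harness|hendrycksTest-<subject>|5` naming pattern are averaged.

```python
# Aggregate per-task MMLU ("hendrycksTest") accuracies from a results dict
# shaped like the JSON above. This excerpt is illustrative, not the full file.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.1746987951807229},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.2573099415204678},
    "harness|winogrande|5": {"acc": 0.5043409629044988},  # not MMLU; excluded
}

# Keep only MMLU subtasks, identified by their key prefix.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]

# Unweighted mean over subtasks (the leaderboard's headline MMLU number is
# also a mean over the 57 subjects).
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_mmlu_acc, 4))  # → 0.216
```

The same pattern extends to `acc_norm` or the stderr fields by swapping the dictionary key.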
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of bhenrym14/mistral-7b-platypus-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bhenrym14/mistral-7b-platypus-fp16](https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T09:15:23.830857](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16/blob/main/results_2023-10-29T09-15-23.830857.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4168414429530201,\n\
\ \"em_stderr\": 0.005049151744527279,\n \"f1\": 0.4591768036912757,\n\
\ \"f1_stderr\": 0.0048851694906548275,\n \"acc\": 0.479468014382712,\n\
\ \"acc_stderr\": 0.010986687977801515\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4168414429530201,\n \"em_stderr\": 0.005049151744527279,\n\
\ \"f1\": 0.4591768036912757,\n \"f1_stderr\": 0.0048851694906548275\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17361637604245642,\n \
\ \"acc_stderr\": 0.010433463221257632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345398\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|arc:challenge|25_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T09_15_23.830857
path:
- '**/details_harness|drop|3_2023-10-29T09-15-23.830857.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T09-15-23.830857.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T09_15_23.830857
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-15-23.830857.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-15-23.830857.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hellaswag|10_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T09_15_23.830857
path:
- '**/details_harness|winogrande|5_2023-10-29T09-15-23.830857.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T09-15-23.830857.parquet'
- config_name: results
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- results_2023-10-09T19-22-13.143311.parquet
- split: 2023_10_29T09_15_23.830857
path:
- results_2023-10-29T09-15-23.830857.parquet
- split: latest
path:
- results_2023-10-29T09-15-23.830857.parquet
---
# Dataset Card for Evaluation run of bhenrym14/mistral-7b-platypus-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bhenrym14/mistral-7b-platypus-fp16](https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-29T09:15:23.830857](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16/blob/main/results_2023-10-29T09-15-23.830857.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.4168414429530201,
"em_stderr": 0.005049151744527279,
"f1": 0.4591768036912757,
"f1_stderr": 0.0048851694906548275,
"acc": 0.479468014382712,
"acc_stderr": 0.010986687977801515
},
"harness|drop|3": {
"em": 0.4168414429530201,
"em_stderr": 0.005049151744527279,
"f1": 0.4591768036912757,
"f1_stderr": 0.0048851694906548275
},
"harness|gsm8k|5": {
"acc": 0.17361637604245642,
"acc_stderr": 0.010433463221257632
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345398
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
VALOSDEUS/KRONK | ---
license: openrail
---
|
DL3DV/DL3DV-ALL-ColmapCache | ---
tags:
- 3D Vision
- NeRF
- 3D Gaussian
- Dataset
- Novel View Synthesis
- Text to 3D
- Image to 3D
pretty_name: DL3DV-Dataset
size_categories:
- n>1T
---
# DL3DV-Dataset
This repo has all the colmap caches for the DL3DV-10K Dataset. We are working hard to review the full dataset to avoid sensitive information. Thank you for your patience.
# Download
If you have enough space, you can use git to download the dataset from Hugging Face. See this [link](https://huggingface.co/docs/hub/en/datasets-downloading).
If you do not have enough space, we further provide a [download script](https://github.com/DL3DV-10K/Dataset/blob/main/scripts/download.py) here to download a subset. The usage:
```Bash
usage: download.py [-h] --odir ODIR --subset {1K,2K,3K,4K,5K,6K,7K,8K,9K,10K} --resolution {4K,2K,960P,480P} --file_type {images+poses,video,colmap_cache} [--hash HASH]
[--clean_cache]
optional arguments:
-h, --help show this help message and exit
--odir ODIR output directory
--subset {1K,2K,3K,4K,5K,6K,7K,8K,9K,10K}
The subset of the benchmark to download
--resolution {4K,2K,960P,480P}
The resolution to download
--file_type {images+poses,video,colmap_cache}
The file type to download
--hash HASH If set subset=hash, this is the hash code of the scene to download
--clean_cache If set, will clean the huggingface cache to save space
```
Here are some examples:
```Bash
# Make sure you have applied for the access.
# Use this to download the download.py script
wget https://raw.githubusercontent.com/DL3DV-10K/Dataset/main/scripts/download.py
# Download colmap cache for 0~1K subset, output to DL3DV-10K directory, ignore the resolution options
python download.py --odir DL3DV-10K --subset 1K --resolution 480P --file_type colmap_cache --clean_cache
```
You can also download a specific scene with its hash. The scene-hash pair visualization can be found [here](https://htmlpreview.github.io/?https://github.com/DL3DV-10K/Dataset/blob/main/visualize/index.html)
```Bash
# Download colmap cache for e2cedefea8a0ed2d0ffbd5bdc08acbe7e1f85c96f72f7b790e9dfe1c98963047, output to DL3DV-10K directory, ignore the resolution options
python download.py --odir DL3DV-10K --subset 1K --resolution 480P --file_type colmap_cache --hash e2cedefea8a0ed2d0ffbd5bdc08acbe7e1f85c96f72f7b790e9dfe1c98963047 --clean_cache
```
# News
- [x] DL3DV-1K, 2K, 3K, 4K
- [ ] DL3DV-5K ~ 10K |
ThankGod/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Andrew_Ng
'1': Elon_Musk
'2': Jay_Z
'3': Kanye
'4': Obama
'5': Queen
splits:
- name: train
num_bytes: 624532.0
num_examples: 16
download_size: 626669
dataset_size: 624532.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/rs_test | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 89445
num_examples: 16
- name: test
num_bytes: 76931
num_examples: 16
download_size: 0
dataset_size: 166376
---
# Dataset Card for `HuggingFaceH4/rs_test`
* SFT model: HuggingFaceH4/falcon-40b-ift-v3.1
* Reward model: HuggingFaceH4/pythia-70m-rm-v0.0
* Temperature: 0.7 |
twdent/Hiking | ---
task_categories:
- image-segmentation
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 316794997.0
num_examples: 38
download_size: 0
dataset_size: 316794997.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset card for Hiking
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset description](#dataset-description)
- [Dataset categories](#dataset-categories)
## Dataset description
- **Homepage:** https://segments.ai/twdent/Hiking
This dataset was created using [Segments.ai](https://segments.ai). It can be found [here](https://segments.ai/twdent/Hiking).
## Dataset categories
| Id | Name | Description |
| --- | ---- | ----------- |
| 1 | traversable | - |
| 2 | non-traversable | - |
|
CyberHarem/evelynn_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of evelynn (League of Legends)
This is the dataset of evelynn (League of Legends), containing 73 images and their tags.
The core tags of this character are `long_hair, purple_hair, yellow_eyes, breasts, earrings, sunglasses, tinted_eyewear, looking_over_eyewear, pink-tinted_eyewear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 91.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelynn_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 54.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelynn_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 137 | 99.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelynn_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 81.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelynn_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 137 | 140.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelynn_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/evelynn_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, k/da_(league_of_legends), looking_at_viewer, solo, bare_shoulders, lipstick, claws, fur_trim, detached_sleeves, halterneck, crop_top, idol, necklace, parted_lips, pince-nez, high-waist_skirt, midriff, high_heels, medium_breasts, microphone, smile, black_skirt, bracelet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | k/da_(league_of_legends) | looking_at_viewer | solo | bare_shoulders | lipstick | claws | fur_trim | detached_sleeves | halterneck | crop_top | idol | necklace | parted_lips | pince-nez | high-waist_skirt | midriff | high_heels | medium_breasts | microphone | smile | black_skirt | bracelet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------------|:--------------------|:-------|:-----------------|:-----------|:--------|:-----------|:-------------------|:-------------|:-----------|:-------|:-----------|:--------------|:------------|:-------------------|:----------|:-------------|:-----------------|:-------------|:--------|:--------------|:-----------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_1_t_0.9 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43705642
num_examples: 18928
- name: epoch_1
num_bytes: 44279461
num_examples: 18928
- name: epoch_2
num_bytes: 44343337
num_examples: 18928
- name: epoch_3
num_bytes: 44374826
num_examples: 18928
- name: epoch_4
num_bytes: 44389402
num_examples: 18928
- name: epoch_5
num_bytes: 44386360
num_examples: 18928
- name: epoch_6
num_bytes: 44376471
num_examples: 18928
- name: epoch_7
num_bytes: 44372604
num_examples: 18928
- name: epoch_8
num_bytes: 44368001
num_examples: 18928
- name: epoch_9
num_bytes: 44362699
num_examples: 18928
- name: epoch_10
num_bytes: 44363222
num_examples: 18928
- name: epoch_11
num_bytes: 44363342
num_examples: 18928
- name: epoch_12
num_bytes: 44363674
num_examples: 18928
- name: epoch_13
num_bytes: 44364103
num_examples: 18928
- name: epoch_14
num_bytes: 44363329
num_examples: 18928
- name: epoch_15
num_bytes: 44364778
num_examples: 18928
- name: epoch_16
num_bytes: 44363355
num_examples: 18928
- name: epoch_17
num_bytes: 44365003
num_examples: 18928
- name: epoch_18
num_bytes: 44364099
num_examples: 18928
- name: epoch_19
num_bytes: 44364622
num_examples: 18928
- name: epoch_20
num_bytes: 44364511
num_examples: 18928
- name: epoch_21
num_bytes: 44363902
num_examples: 18928
- name: epoch_22
num_bytes: 44364063
num_examples: 18928
- name: epoch_23
num_bytes: 44364764
num_examples: 18928
- name: epoch_24
num_bytes: 44364854
num_examples: 18928
- name: epoch_25
num_bytes: 44364043
num_examples: 18928
- name: epoch_26
num_bytes: 44364184
num_examples: 18928
- name: epoch_27
num_bytes: 44363332
num_examples: 18928
- name: epoch_28
num_bytes: 44364482
num_examples: 18928
- name: epoch_29
num_bytes: 44363888
num_examples: 18928
download_size: 697305486
dataset_size: 1330240353
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
yentinglin/grammar-correction | ---
dataset_info:
features:
- name: _id
dtype: string
- name: task
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 17749107
num_examples: 69071
- name: validation
num_bytes: 643075
num_examples: 1712
download_size: 10350382
dataset_size: 18392182
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Denisilva/VOZCANALman | ---
license: openrail
---
|
hugfaceguy0001/simpsons_info | ---
dataset_info:
features:
- name: id
dtype: int64
- name: season
dtype: int64
- name: episode_id_in_season
dtype: int64
- name: title
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: plot
dtype: string
splits:
- name: train
num_bytes: 2294684
num_examples: 750
download_size: 1403036
dataset_size: 2294684
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: openrail
task_categories:
- text-classification
- text-generation
- text2text-generation
- video-classification
language:
- en
tags:
- art
- culture
- popular
- video
pretty_name: simpsons
size_categories:
- n<1K
---
Information on all episodes of the cartoon show "The Simpsons", from Wikipedia. Some plots are missing (mainly in the recent seasons 32, 33, and 34). |
Cohere/miracl-hi-queries-22-12 | ---
annotations_creators:
- expert-generated
language:
- hi
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (hi) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-hi-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-hi-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-hi-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-hi-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
## Embeddings
We compute the embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset
In [miracl-hi-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-hi-corpus-22-12) we provide the corpus embeddings. Note, depending on the selected split, the respective files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/miracl-hi-corpus-22-12", split="train")
```
Or you can stream it without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/miracl-hi-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-hi-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-hi-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search in the documents, you must use the **dot-product**.
You can compare the query embedding with the document embeddings either using a vector database (recommended) or by computing the dot product directly.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB.
from datasets import load_dataset
import torch

# Load documents + embeddings
docs = load_dataset("Cohere/miracl-hi-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])

# Load queries
queries = load_dataset("Cohere/miracl-hi-queries-22-12", split="dev")

# Select the first query as example
qid = 0
query = queries[qid]
query_embedding = torch.tensor(query['emb']).unsqueeze(0)  # shape: [1, dim]

# Compute dot score between the query embedding and all document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)

# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
    print(docs[doc_id]['title'])
    print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
# Run: pip install cohere
import cohere
co = cohere.Client(api_key)  # add your Cohere API key here
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0]  # embedding for the first text
```
## Performance
In the following table we compare the Cohere multilingual-22-12 model with Elasticsearch 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch does not support all of the languages in the MIRACL dataset.
We report nDCG@10 (a ranking-based metric) as well as hit@3: whether at least one relevant document appears in the top-3 results. We find hit@3 easier to interpret, as it gives the fraction of queries for which a relevant document is found among the top-3 results.
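These two metrics can be sketched in a few lines of plain Python (an illustrative reimplementation with binary relevance; the reported numbers were produced with the official evaluation tooling, not this snippet):

```python
import math

def ndcg_at_k(ranked_ids, relevant_ids, k=10):
    # DCG over the top-k ranked documents, binary relevance
    dcg = sum(1.0 / math.log2(i + 2)
              for i, doc_id in enumerate(ranked_ids[:k])
              if doc_id in relevant_ids)
    # Ideal DCG: all relevant documents ranked first
    idcg = sum(1.0 / math.log2(i + 2)
               for i in range(min(len(relevant_ids), k)))
    return dcg / idcg if idcg > 0 else 0.0

def hit_at_k(ranked_ids, relevant_ids, k=3):
    # 1 if at least one relevant document appears in the top-k results
    return 1.0 if any(d in relevant_ids for d in ranked_ids[:k]) else 0.0

ranking = ["d3", "d7", "d1", "d9"]
relevant = {"d1", "d9"}
print(hit_at_k(ranking, relevant))  # 1.0 (d1 is in the top 3)
```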
Note: MIRACL annotated only a small fraction of passages (10 per query) for relevance. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. The real nDCG@10 and hit@3 performance is likely higher than depicted.
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
trl-internal-testing/tldr-preference-trl-style | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: info
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: summaries
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: choice
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
splits:
- name: train
num_bytes: 597814060
num_examples: 92858
- name: validation
num_bytes: 543890585
num_examples: 83802
- name: validation_cnndm
num_bytes: 35776521
num_examples: 2284
download_size: 139401121
dataset_size: 1177481166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: validation_cnndm
path: data/validation_cnndm-*
---
# TRL's TL;DR Preference Dataset
We preprocess the dataset using our standard `prompt, chosen, rejected` format.
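For illustration, a row in this format looks roughly like the following (a hand-written sketch mirroring the schema above, not an actual record from the dataset):

```python
# Invented example row in the prompt/chosen/rejected chat format
row = {
    "prompt": "SUBREDDIT: r/relationships\nTITLE: ...\nPOST: ...\nTL;DR:",
    "chosen": [
        {"role": "user", "content": "SUBREDDIT: r/relationships\nTITLE: ...\nPOST: ...\nTL;DR:"},
        {"role": "assistant", "content": "Preferred summary."},
    ],
    "rejected": [
        {"role": "user", "content": "SUBREDDIT: r/relationships\nTITLE: ...\nPOST: ...\nTL;DR:"},
        {"role": "assistant", "content": "Less preferred summary."},
    ],
}

# Both completions share the same user turn; only the assistant turn differs.
assert row["chosen"][0]["content"] == row["rejected"][0]["content"]
print(row["chosen"][-1]["content"])  # Preferred summary.
```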
## Reproduce this dataset
1. Download `tldr_preference.py` from https://huggingface.co/datasets/trl-internal-testing/tldr-preference-trl-style/tree/0.1.0.
2. Run `python examples/datasets/tldr_preference.py --push_to_hub --hf_entity trl-internal-testing`
|
daspartho/agree_disagree | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: statement
dtype: string
- name: reply
dtype: string
- name: sentiment
dtype: int64
splits:
- name: train
num_bytes: 267030
num_examples: 1660
download_size: 113328
dataset_size: 267030
---
# Dataset Card for "agree_disagree"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MartinKu/bookcorpus_stage2_coverage | ---
dataset_info:
features:
- name: text
dtype: string
- name: S_V_position
sequence: int64
- name: O_C_position
sequence: int64
- name: start_point_list
sequence: int64
splits:
- name: train
num_bytes: 41837757690
num_examples: 74004228
download_size: 5208316237
dataset_size: 41837757690
---
# Dataset Card for "bookcorpus_stage2_coverage"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713142785 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 59936
num_examples: 153
download_size: 38548
dataset_size: 59936
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
euisuh15/poison-cwe | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test1
path: data/test1-*
- split: test2
path: data/test2-*
- split: val
path: data/val-*
- split: new_test1
path: data/new_test1-*
- split: new_test2
path: data/new_test2-*
dataset_info:
features:
- name: file_change_id
dtype: int64
- name: method_change_id
dtype: int64
- name: code
dtype: string
- name: name
dtype: string
- name: cwe_id
dtype: string
- name: cve_id
dtype: string
- name: before_change
dtype: bool
- name: index
dtype: int64
- name: index_grouped
dtype: string
- name: count
dtype: float64
- name: type
dtype: string
- name: output
dtype: string
- name: input
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4045827
num_examples: 1798
- name: test1
num_bytes: 539359
num_examples: 226
- name: test2
num_bytes: 745301
num_examples: 308
- name: val
num_bytes: 339243
num_examples: 146
- name: new_test1
num_bytes: 66028
num_examples: 20
- name: new_test2
num_bytes: 35658
num_examples: 20
download_size: 73465
dataset_size: 5771416
---
# Dataset Card for "poison-cwe"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_no_gender_distinction | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 20322
num_examples: 108
- name: test
num_bytes: 15434
num_examples: 88
- name: train
num_bytes: 69365
num_examples: 368
download_size: 78727
dataset_size: 105121
---
# Dataset Card for "MULTI_VALUE_stsb_no_gender_distinction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baptistecolle/sam-controlnet-5 | ---
dataset_info:
features:
- name: masks
dtype: image
splits:
- name: train
num_bytes: 140788170.0
num_examples: 1000
download_size: 0
dataset_size: 140788170.0
---
# Dataset Card for "sam-controlnet-5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naorm/DNRTI | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2602589
num_examples: 145609
- name: validation
num_bytes: 324626
num_examples: 18264
- name: test
num_bytes: 326502
num_examples: 18380
download_size: 1547968
dataset_size: 3253717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
AshtonIsNotHere/biosift | ---
dataset_info:
features:
- name: PMID
dtype: int64
- name: Title
dtype: string
- name: Abstract
dtype: string
- name: Split
dtype: string
- name: Number of Annotators
dtype: int64
- name: Aggregate
dtype: int64
- name: Has Human Subjects
dtype: float64
- name: Has Target Disease
dtype: float64
- name: Cohort Study or Clinical Trial
dtype: float64
- name: Has Quantitative Outcome Measure
dtype: float64
- name: Has Study Drug(s)
dtype: float64
- name: Has Population Size
dtype: float64
- name: Has Comparator Group
dtype: float64
- name: label
sequence: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 15286088
num_examples: 8005
- name: validation
num_bytes: 1931610
num_examples: 997
- name: test
num_bytes: 1923714
num_examples: 998
download_size: 9802250
dataset_size: 19141412
---
# Dataset Card for "biosift"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/misaka_imouto_toarumajutsunoindex | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of misaka_imouto (To Aru Majutsu no Index)
This is the dataset of misaka_imouto (To Aru Majutsu no Index), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
misshimichka/flower_faces_dataset_v3 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 97085058.0
num_examples: 69
download_size: 97088269
dataset_size: 97085058.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
saibala29/Pokedex_Data | ---
license: mit
---
# Pokémon Dataset Overview 📊
This dataset provides a comprehensive compilation of Pokémon data 🎮, covering various aspects such as stats, types, generations, and legendary status. It's designed for enthusiasts, researchers, and developers interested in exploring Pokémon data for analysis, machine learning models, and application development 🚀.
## Dataset Description 📝
The Pokémon dataset includes the following key features:
- **Name**: The name of the Pokémon. 🧚
- **Type 1**: The primary type of the Pokémon. 🔥/💧/🌿
- **Type 2**: The secondary type of the Pokémon (if any). ⚡/🪨/🧊
- **Total**: Sum of all stats, providing an overall strength rating. 💪
- **HP**: Hit Points or health. ❤️
- **Attack**: The base modifier for normal attacks. 🗡️
- **Defense**: The base damage resistance against normal attacks. 🛡️
- **Sp. Atk**: Special Attack, the base modifier for special attacks. ✨
- **Sp. Def**: Special Defense, the base damage resistance against special attacks. 🌟
- **Speed**: Determines how quickly a Pokémon can act in battle. 💨
- **Generation**: Indicates the generation a Pokémon belongs to. 🔄
- **Legendary**: Indicates whether a Pokémon is legendary. 🌈
## Dataset Structure 🏗️
### Files and Folders 📁
- `Pokemon.csv`: Main dataset file containing all Pokémon data. 📄
- `Pokemon_Final_Fixed_Questions_Queries.csv`: Contains questions and MongoDB queries related to the Pokémon dataset, useful for database exercises and training AI models. 🤔💡
### Data Fields 🛠️
A brief description of the dataset fields is as follows:
- `Name`: String 📛
- `Type 1`: String 🔥/💧/🌿
- `Type 2`: String (nullable) ⚡/🪨/🧊
- `Total`, `HP`, `Attack`, `Defense`, `Sp. Atk`, `Sp. Def`, `Speed`: Integer 📊
- `Generation`: Integer 🔄
- `Legendary`: Boolean ✨
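As a quick illustration, the CSV can be explored with the Python standard library alone (a sketch using an invented two-row sample with the column layout above; in practice you would read `Pokemon.csv` from disk):

```python
import csv
import io

# A two-row sample standing in for Pokemon.csv (same column layout as above)
sample = io.StringIO(
    "Name,Type 1,Type 2,Total,HP,Attack,Defense,Sp. Atk,Sp. Def,Speed,Generation,Legendary\n"
    "Bulbasaur,Grass,Poison,318,45,49,49,65,65,45,1,False\n"
    "Mewtwo,Psychic,,680,106,110,90,154,90,130,1,True\n"
)

rows = list(csv.DictReader(sample))
legendaries = [r["Name"] for r in rows if r["Legendary"] == "True"]
strongest = max(rows, key=lambda r: int(r["Total"]))
print(legendaries)        # ['Mewtwo']
print(strongest["Name"])  # Mewtwo
```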
## Usage 📚
This dataset can be utilized for various purposes, including but not limited to:
- Data analysis and visualization of Pokémon characteristics. 📈
- Training machine learning models to predict outcomes of Pokémon battles. 🤖
- Developing applications or games that leverage Pokémon data. 🎮
## Acknowledgements 🙏
This dataset is made available for educational and research purposes. Please respect the Pokémon trademark and use this dataset responsibly.
## License 📜
This dataset is provided for non-commercial, research, or educational purposes. Please review the specific license terms if applicable.
|
anan-2024/twitter_dataset_1713148495 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 69405
num_examples: 181
download_size: 42411
dataset_size: 69405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aasarap/allfaq | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 216215.3
num_examples: 721
- name: test
num_bytes: 92663.7
num_examples: 309
download_size: 115318
dataset_size: 308879.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
s2e-lab/RegexEval | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- regex
- redos
- security
pretty_name: RegexEval
size_categories:
- n<1K
---
# Dataset Card for RegexEval
<!-- Provide a quick summary of the dataset. -->
Re(gEx|DoS)Eval is a framework that includes a dataset of 762 regex descriptions (prompts) from real users, refined prompts with examples, and a robust set of tests.
## Dataset Details
### Dataset Description
- **Curated by:** Mohammed Latif Siddiq, Jiahao Zhang, Lindsay Roney, and Joanna C. S. Santos
- **Language(s):** English
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/s2e-lab/RegexEval
- **Paper:** https://s2e-lab.github.io/preprints/icse_nier24-preprint.pdf
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
- dataset.jsonl: dataset file in jsonl format. Every line contains a JSON object with the following fields:
- `id`: unique identifier of the sample.
- `raw_prompt`: Raw/original prompt from the real users with the description of the RegEx.
- `refined_prompt`: Refined prompt with the description of the RegEx.
- `matches`: Matches examples for the RegEx.
- `non-matches`: Non-matches examples for the RegEx.
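For illustration, a candidate regex can be checked against a sample's example strings with a few lines of Python (a sketch; the field names follow the schema above, and the record shown is invented):

```python
import re

def passes_tests(pattern, matches, non_matches):
    """Return True if the regex fully matches every positive example
    and rejects every negative example."""
    compiled = re.compile(pattern)
    ok_pos = all(compiled.fullmatch(s) for s in matches)
    ok_neg = not any(compiled.fullmatch(s) for s in non_matches)
    return ok_pos and ok_neg

# Invented sample in the dataset's shape
sample = {
    "id": "example-1",
    "raw_prompt": "Match a 4-digit year.",
    "matches": ["1999", "2024"],
    "non-matches": ["99", "20245"],
}

print(passes_tests(r"\d{4}", sample["matches"], sample["non-matches"]))  # True
```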
## Dataset Creation
### Source Data
We mined (on Aug. 16th, 2023) all the regexes from [RegExLib](https://regexlib.com/), a regular expression library. We use this library because it contains user-contributed regular expressions.
We obtained from RegExLib a list of 4,128 regular expressions along with their id, description, and list of expected matches and non-match strings.
#### Data Collection and Processing
For each sample previously collected, we perform a manual validation to (1) filter out incorrect regexes, (2) create more sample test cases (i.e., matching and non-matching string examples), and (3) create refined problem descriptions (i.e., prompts).
We excluded any regex that matched one or more of the following conditions: (i) it was missing any metadata, i.e., description and/or list of expected matches and non-matches; (ii) its description is not written in English; (iii) its description included vulgar words; (iv) its description does not provide sufficient information to understand the purpose of the regular expression; (v) it aimed to detect just one word; (vi) it is incorrect (i.e., the regex matches a string that is not supposed to match, or it does not match a string that is expected to match). After this step, we have 1,001 regex samples.
Each collected regex sample had (on average) only 4 string examples (2 that are expected matches and 2 that are expected non-matches). Thus, we manually crafted additional test cases to ensure that each sample has at least 13 matching and 12 non-matching string examples. After creating these additional test strings, we evaluated the regex with the new set of test cases again and excluded the failed regex samples. Hence, we have 762 samples in our final dataset.
Upon further inspection of the descriptions in the extracted sample, we observed that some of them lacked a more detailed explanation (e.g., ID#84: “SQL date format tester.”) or had extra information unrelated to the regex (e.g., ID#4: “... Other than that, this is just a really really long description of a regular expression that I’m using to test how my front page will look in the case where very long expression descriptions are used”). Thus, we created a refined prompt with a clear description of the regex that includes three match and two non-match string examples.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@inproceedings{siddiq2024regexeval,
author={Siddiq, Mohammed Latif and Zhang, Jiahao and Roney, Lindsay and Santos, Joanna C. S.},
booktitle={Proceedings of the 46th International Conference on Software Engineering, NIER Track (ICSE-NIER '24)},
title={Re(gEx|DoS)Eval: Evaluating Generated Regular Expressions and their Proneness to DoS Attacks},
year={2024}
}
```
## Dataset Card Authors and Contact
[Mohammed Latif Siddiq](http://lsiddiqsunny.github.io) |
ai-habitat/hab3_bench_assets | ---
license: cc-by-nc-4.0
viewer: false
---
# Habitat v0.3.x Benchmark Dataset
Assets, configs, and episodes for reproducible benchmarking on Habitat v0.3.x.
## Setup
Clone this repo and symlink it as `data/hab3_bench_assets` in the habitat-lab directory.
Download the [Habitat compatible YCB SceneDataset](https://huggingface.co/datasets/ai-habitat/ycb) and create a symbolic link in `data/objects/ycb` or use the habitat-sim datasets_download script ([README](https://github.com/facebookresearch/habitat-sim/blob/main/DATASETS.md#ycb-benchmarks---object-and-model-set)).
## Contents:
- Scene Dataset: `hab3-hssd/` - the necessary configs and assets to load a subset of HSSD dataset into habitat-lab and utilize it for Hab3 rearrangement tasks.
- Episode Datasets: `episode_datasets` - a set of serialized RearrangeDataset files generated for the benchmark SceneDataset. See "Generating New Episodes" below for details.
- `hab3_bench_ep_gen_config.yaml` - config file for generating new RearrangeDataset files.
- Example Humanoid assets - URDF, skin meshes, motion files for one humanoid.
## Generating New Episodes:
The provided config `hab3_bench_ep_gen_config.yaml` is available for generating new hab3 benchmarking episodes. It defines the scene, objects, and generator configs (e.g. number of clutter objects).
The generator command should be run on a Habitat 3.0 compatible branch (e.g. SIRo) with the included assets from `fpss/fphab` commit `cd1549303d759abacbb377a8dd52c5f7af0d0e5a` as follows:
```
python -u habitat-lab/habitat/datasets/rearrange/run_episode_generator.py --config data/hab3_bench_assets/hab3_bench_ep_gen_config.yaml --run --verbose --num-episodes 10 --seed 0 --out data/hab3_bench_assets/episode_datasets/large_large.json.gz
```
Naming of the episode file `<scene_complexity>_<object_complexity>.json.gz` depends on the following parameters:
### Scene Complexity:
Currently we are testing on 3 differently sized scenes:
- `small`: 103997919_171031233 (area 35.92) - 1 bed, 1 bath
- `medium`: 108736635_177263256 (area 55.49) - 3 bed, 2 bath
- `large`: 102816009 (area 172.43) - 4 bed, 4 bath + den & office
One of these scene sets must be selected in the config before generation.
### Object Complexity:
Currently we are testing 3 clutter object size complexities:
- `small`: 2 objects
- `medium`: 5 objects
- `large`: 10 objects
One of these sampler params must be selected in the config before generation.
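Putting the two lists together, the nine expected episode filenames can be enumerated with a short sketch (assuming one generator run per scene/object combination):

```python
from itertools import product

scene_complexities = ["small", "medium", "large"]
object_complexities = ["small", "medium", "large"]

# <scene_complexity>_<object_complexity>.json.gz, as described above
episode_files = [
    f"{scene}_{objects}.json.gz"
    for scene, objects in product(scene_complexities, object_complexities)
]
print(len(episode_files))  # 9
print(episode_files[0])    # small_small.json.gz
```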
## License Notes:
HSSD assets and episodes are provided under cc-by-nc license as a subset of the dataset described here: https://3dlg-hcvc.github.io/hssd/
Example humanoid asset shapes are provided under cc-by-nc license and motions under [SMPL Body Motion File License ](https://smpl.is.tue.mpg.de/bodylicense.html) as a subset of https://huggingface.co/datasets/ai-habitat/habitat_humanoids
|
TrainingDataPro/anti-spoofing_replay | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
language:
- en
tags:
- finance
- legal
- code
dataset_info:
features:
- name: live_video_id
dtype: string
- name: phone
dtype: string
- name: video_file
dtype: string
- name: phone_video_playback
dtype: string
- name: worker_id
dtype: string
splits:
- name: train
num_bytes: 5063
num_examples: 30
download_size: 735628032
dataset_size: 5063
---
# Anti-Spoofing dataset: replay
The dataset consists of 30,000+ videos of replay attacks from people from 157 countries.
It is based on data from **Anti Spoofing Real Dataset**: https://huggingface.co/datasets/TrainingDataPro/anti-spoofing_Real.
The dataset addresses tasks in the field of anti-spoofing and is useful for business and safety systems.
The dataset includes **replay attacks**: videos from the Anti Spoofing Real dataset re-filmed on a phone.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/anti-spoofing-replay?utm_source=huggingface&utm_medium=cpc&utm_campaign=anti-spoofing_replay) to discuss your requirements, learn about the price and buy the dataset.
# File with the extension .csv
includes the following information for each media file:
- **live_video_id**: the unique identifier of the "Antispoofing Live" video
- **phone**: the device used to capture the replay video
- **link**: the URL to access the replay video
- **phone_video_playback**: the device used to play the "Antispoofing Live" video
- **worker_id**: the identifier of the person who provided the media file
# Folder "img" with media files
- containing all the photos and videos that correspond to the data in the .csv file
**How it works**: *open the first folder and verify that it contains the media files captured by the person whose parameters are specified in the first line of the .csv file.*
## [**TrainingData**](https://trainingdata.pro/data-market/anti-spoofing-replay?utm_source=huggingface&utm_medium=cpc&utm_campaign=anti-spoofing_replay) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
mteb/neuclir-2023-zho | ---
language:
- zho
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- neuclir
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_examples: 27638
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_examples: 3179209
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_examples: 76
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
---
From the NeuCLIR TREC Track 2023: https://arxiv.org/abs/2304.12367
Generated from https://huggingface.co/datasets/neuclir/neuclir1
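For illustration, the three configs (qrels in `default`, plus `corpus` and `queries`) can be joined into a per-query relevance map; the sketch below uses invented records with the same field names (in practice each config is loaded with `datasets.load_dataset`):

```python
from collections import defaultdict

# Invented records mirroring the three configs described above
qrels = [  # config "default", split "test"
    {"query-id": "q1", "corpus-id": "d1", "score": 1.0},
    {"query-id": "q1", "corpus-id": "d2", "score": 0.0},
]
queries = [{"_id": "q1", "text": "示例查询"}]
corpus = [{"_id": "d1", "title": "标题", "text": "正文"}]

# Group relevance judgments by query, keeping only positive labels
relevant = defaultdict(set)
for judgment in qrels:
    if judgment["score"] > 0:
        relevant[judgment["query-id"]].add(judgment["corpus-id"])

print(dict(relevant))  # {'q1': {'d1'}}
```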
```
@article{lawrie2024overview,
title={Overview of the TREC 2023 NeuCLIR Track},
author={Lawrie, Dawn and MacAvaney, Sean and Mayfield, James and McNamee, Paul and Oard, Douglas W and Soldaini, Luca and Yang, Eugene},
url={https://trec.nist.gov/pubs/trec32/papers/Overview_neuclir.pdf},
year={2024}
}
```
|
BangumiBase/engagekiss | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Engage Kiss
This is the image base of the bangumi Engage Kiss. We detected 16 characters and 1252 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 176 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 166 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 64 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 34 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 324 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 57 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 30 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 85 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 44 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 15 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 24 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 14 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 80 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 28 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 10 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 101 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
sevin987/KoChatGpt | ---
license: unknown
---
|
CyberHarem/mariabell_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mariabell (Fire Emblem)
This is the dataset of mariabell (Fire Emblem), containing 45 images and their tags.
The core tags of this character are `blonde_hair, bow, hair_bow, long_hair, drill_hair, earrings, breasts, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 37.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mariabell_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 27.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mariabell_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 88 | 50.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mariabell_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 35.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mariabell_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 88 | 61.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mariabell_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mariabell_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, jewelry, looking_at_viewer, open_mouth, ascot, pink_gloves, smile, umbrella |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jewelry | looking_at_viewer | open_mouth | ascot | pink_gloves | smile | umbrella |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------------------|:-------------|:--------|:--------------|:--------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/VALUE_rte_negative_concord | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 6788
num_examples: 12
- name: test
num_bytes: 81330
num_examples: 164
- name: train
num_bytes: 76553
num_examples: 149
download_size: 11963
dataset_size: 164671
---
# Dataset Card for "VALUE_rte_negative_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sedexjd/vitao | ---
license: openrail
---
|
gguichard/wsd_myriade_synth_data_gpt4turbo_5 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 25754440
num_examples: 39527
download_size: 5424029
dataset_size: 25754440
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YMKiii/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6995518.0
num_examples: 7
download_size: 6997474
dataset_size: 6995518.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xxxlllfff/ffff | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 72
num_examples: 3
download_size: 1218
dataset_size: 72
---
# Dataset Card for "ffff"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_maywell__PiVoT-0.1-Evil-a | ---
pretty_name: Evaluation run of maywell/PiVoT-0.1-Evil-a
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/PiVoT-0.1-Evil-a](https://huggingface.co/maywell/PiVoT-0.1-Evil-a) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__PiVoT-0.1-Evil-a\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T18:10:37.734166](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-0.1-Evil-a/blob/main/results_2023-12-03T18-10-37.734166.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4040940106141016,\n\
\ \"acc_stderr\": 0.01351675297272172\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.4040940106141016,\n \"acc_stderr\": 0.01351675297272172\n\
\ }\n}\n```"
repo_url: https://huggingface.co/maywell/PiVoT-0.1-Evil-a
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_03T18_05_40.726563
path:
- '**/details_harness|gsm8k|5_2023-12-03T18-05-40.726563.parquet'
- split: 2023_12_03T18_10_37.734166
path:
- '**/details_harness|gsm8k|5_2023-12-03T18-10-37.734166.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T18-10-37.734166.parquet'
- config_name: results
data_files:
- split: 2023_12_03T18_05_40.726563
path:
- results_2023-12-03T18-05-40.726563.parquet
- split: 2023_12_03T18_10_37.734166
path:
- results_2023-12-03T18-10-37.734166.parquet
- split: latest
path:
- results_2023-12-03T18-10-37.734166.parquet
---
# Dataset Card for Evaluation run of maywell/PiVoT-0.1-Evil-a
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/maywell/PiVoT-0.1-Evil-a
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [maywell/PiVoT-0.1-Evil-a](https://huggingface.co/maywell/PiVoT-0.1-Evil-a) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__PiVoT-0.1-Evil-a",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T18:10:37.734166](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-0.1-Evil-a/blob/main/results_2023-12-03T18-10-37.734166.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4040940106141016,
"acc_stderr": 0.01351675297272172
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.01351675297272172
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
soda-lmu/tweet-annotation-sensitivity-1 | ---
task_categories:
- text-classification
language:
- en
task_ids:
- sentiment-classification
- hate-speech-detection
size_categories:
- 1K<n<10K
---
# Tweet Annotation Sensitivity Experiment 1: Annotation in Six Experimental Conditions
***<font color="red">Attention: This repository contains cases that might be offensive or upsetting. We do not support the views expressed in these hateful posts.</font>***
## Description
We drew a stratified sample of 20 tweets that were pre-annotated in a study by [Davidson et al. (2017)](https://ojs.aaai.org/index.php/ICWSM/article/view/14955) as Hate Speech / Offensive Language / Neither. The stratification was done with respect to the majority-voted class and the level of disagreement.
We then recruited 1000 [Prolific](https://www.prolific.com/) workers to annotate each of the 20 tweets. Annotators were randomly assigned to one of six experimental conditions, as shown in the following figures. In these conditions, they were asked to assign the labels Hate Speech / Offensive Language / Neither.
In addition, we collected a variety of demographic variables (e.g. age and gender) and some paradata (e.g. duration of the whole task, duration per screen).
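For readers who want to reproduce a similar design, the stratified draw can be sketched with pandas. The pool and column names below are illustrative placeholders, not the actual Davidson et al. (2017) fields:

```python
import pandas as pd

# Hypothetical pre-annotated pool; the column names are illustrative
# placeholders, not the actual Davidson et al. (2017) field names.
pool = pd.DataFrame({
    "tweet_id": range(12),
    "majority_class": ["hate", "offensive", "neither"] * 4,
    "disagreement": (["low"] * 3 + ["high"] * 3) * 2,
})

# Draw one tweet per (majority class, disagreement level) stratum.
sample = pool.groupby(["majority_class", "disagreement"]).sample(n=1, random_state=42)
print(len(sample))  # 6 strata -> 6 sampled tweets
```

Sampling within each `(majority_class, disagreement)` group guarantees that both clear-cut and contested tweets from every class end up in the annotation pool.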
<img src="https://raw.githubusercontent.com/boleima/tweet-annotation-supplements/main/figs/image.png" width = "500" height = "250" alt="" align=center />
## Examples of Annotation Conditions
Here we show some examples of the annotation conditions:
Condition A
<img src="https://raw.githubusercontent.com/boleima/tweet-annotation-supplements/main/figs/condition_a.png" width = "500" height = "250" alt="" align=center />
Condition C
<img src="https://raw.githubusercontent.com/boleima/tweet-annotation-supplements/main/figs/condition_c.png" width = "500" height = "250" alt="" align=center />
Condition E
<img src="https://raw.githubusercontent.com/boleima/tweet-annotation-supplements/main/figs/condition_e.png" width = "500" height = "250" alt="" align=center />
## Tweets
We use the following 20 tweets from [Davidson et al. (2017)](https://ojs.aaai.org/index.php/ICWSM/article/view/14955). The usernames were hashed.
| Tweet ID | Tweets Hashed |
| -------------- | ------------------ |
|1|Jihadi John has struck again. Prayers with the victims' families. What is the #bounty on His head? Time 4 him 2 meet allah|
|2|RT @#####: Seriously, Florida. A vote for Charlie Christ is a vote for taxes and ebola.|
|3|Islamic Jihadis run away to Pakistan @##### @##### #UniteBlue|
|4|At this rate, I'd cheer for the awful New York Yankees over the St. Louis Cardinals.|
|5|Allergies have my eyes all red and chink eyed, I look like I'm stoned|
|6|Not protected in the state of Utah. There is no fear of queer couples forcing churches to marry them. They do not legally have to do so.|
|7|RT @#####: Boss Niggas Fuk wit Boss bitches Vice versa..|
|8|RT @#####: ever wanna text somebody and be like bitch die I hope you don't wake up you low down dirty muthafxcka😈|
|9|@##### @##### @##### she's a right twat, I'd tell you, honestly Sky! Honestly 😈|
|10|So yall give farwells to Lulu yet did all but shit on Kris? Ok bitches... ok|
|11|RT @#####: Dis bitch wanna smoke all our weed|
|12|Shittin on all my old hoes|
|13|Getting hit off a cunt thats about 6 foot sideways and a cunt that was in the army lol, i dini care, yous are fuckin laughable|
|14|When it rains it's good "go to the mall weather" all the coons be at home|
|15|Don't mind black people.....hate niggers!|
|16|My eyes feel so chink eyed|
|17|"From now on, I will call all radical MUSLIMS niggers!It is very fitting and it is racist! I AM RACIST AGAINST ANYONE WHO HATES AMERICA!"""|
|18|RT @#####: Fucking chinks in Clash of Clans always fuck me up. Bitch ass kamikaze faggots.|
|19|@##### ya I know all the slang I'm racist I h8 porch monkies|
|20|RT @#####: Stupid fucking nigger LeBron. You flopping stupid jungle bunny monkey faggot.|
## Codebook
| Column Name | Description | Type |
| -------------- | ------------------ |---------------- |
|id|annotator ID|integer|
|age|Age|integer|
|gender|Gender<br> 1: Female<br>2: Male<br> 3: Something Else<br> 4: Prefer not to say<br> |factor|
|afam|African-American<br> 0: No<br> 1: Yes|binary|
|asian|Asian-American<br> 0: No<br> 1: Yes|binary|
|hispanic|Hispanic<br> 0: No<br> 1: Yes|binary|
|white|White<br> 0: No<br> 1: Yes|binary|
|race_other|Other race/ethnicity<br> 0: No<br> 1: Yes|binary|
|race_not_say|Prefer not to say race/ethnicity<br> 0: No<br> 1: Yes|binary|
|education|Highest educational attainment<br> 1: Less than high school<br>2: High school<br> 3: Some college<br> 4: College graduate<br> 5: Master's degree or professional degree (Law, Medicine, MPH, etc.) <br> 6: Doctoral degree (PhD, DPH, EdD, etc.)|factor|
|sexuality|Sexuality<br> 1: Gay or Lesbian<br>2: Bisexual<br> 3: Straight<br> 4: Something Else<br> |factor|
|english|English first language? <br> 0: No<br> 1: Yes|binary|
|tw_use|Twitter Use <br> 1: Most days<br>2: Most weeks, but not every day<br> 3: A few times a month<br> 4: A few times a year<br> 5: Less often <br> 6: Never|factor|
|social_media_use|Social Media Use<br> 1: Most days<br>2: Most weeks, but not every day<br> 3: A few times a month<br> 4: A few times a year<br> 5: Less often <br> 0: Never|factor|
|prolific_hours|Prolific hours worked last month|integer|
|task_fun|Coding work was: fun<br> 0: No<br> 1: Yes|binary|
|task_interesting|Coding work was: interesting<br> 0: No<br> 1: Yes|binary|
|task_boring|Coding work was: boring<br> 0: No<br> 1: Yes|binary|
|task_repetitive|Coding work was: repetitive<br> 0: No<br> 1: Yes|binary|
|task_important|Coding work was: important<br> 0: No<br> 1: Yes|binary|
|task_depressing|Coding work was: depressing<br> 0: No<br> 1: Yes|binary|
|task_offensive|Coding work was: offensive<br> 0: No<br> 1: Yes|binary|
|another_tweettask|Likelihood to do another Tweet related task<br> not at all: Not at all likely<br> somewhat: Somewhat likely<br> very: Very likely|factor|
|another_hatetask|Likelihood to do another Hate Speech related task<br> not at all: Not at all likely<br> somewhat: Somewhat likely<br> very: Very likely|factor|
|page_history|Order in which annotator saw pages|character|
|date_of_first_access|Datetime of first access|datetime|
|date_of_last_access|Datetime of last access|datetime|
|duration_sec|Task duration in seconds|integer|
|version|Version of annotation task <br> A: Version A<br>B: Version B<br> C: Version C<br> D: Version D<br> E: Version E<br> F: Version F|factor|
|tw1-20|Label assigned to Tweet 1-20<br> hate speech: Hate Speech<br> offensive language: Offensive Language<br> neither: Neither HS nor OL <br> NA: Missing or "don't know"|factor|
|tw_duration_1-20|Annotation duration in milliseconds Tweet 1-20|numerical|
|num_approvals|Prolific data: number of previous task approvals of annotator|integer|
|num_rejections|Prolific data: number of previous task rejections of annotator|integer|
|prolific_score|Annotator quality score by Prolific|numerical|
|countryofbirth|Prolific data: Annotator country of birth|character|
|currentcountryofresidence|Prolific data: Annotator country of residence|character|
|employmentstatus|Prolific data: Annotator Employment Status<br> Full-time<br> Part-time<br> Unemployed (and job-seeking)<br> Due to start a new job within the next month<br> Not in paid work (e.g. homemaker, retired or disabled)<br> Other<br> DATA EXPIRED|factor|
|firstlanguage|Prolific data: Annotator first language|character|
|nationality|Prolific data: Nationality|character|
|studentstatus|Prolific data: Student status<br> Yes<br> No <br> DATA EXPIRED|factor|
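As a usage sketch, the numeric factor codes above can be mapped to readable labels with pandas. The `gender_labels` mapping follows the codebook; the sample rows are made up for illustration:

```python
import pandas as pd

# Recode numeric factor columns into readable labels, following the codebook.
gender_labels = {1: "Female", 2: "Male", 3: "Something Else", 4: "Prefer not to say"}

# Illustrative rows only; real data would be loaded from the dataset itself.
annotations = pd.DataFrame({
    "id": [101, 102, 103],
    "gender": [1, 2, 4],
    "version": ["A", "C", "F"],
})
annotations["gender"] = annotations["gender"].map(gender_labels)
print(annotations["gender"].tolist())  # ['Female', 'Male', 'Prefer not to say']
```

The same `.map()` pattern applies to the other factor columns (e.g. `education`, `sexuality`, `tw_use`), using their respective code tables from the codebook.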
## Citation
If you found the dataset useful, please cite:
```
@InProceedings{beck2022,
author="Beck, Jacob and Eckman, Stephanie and Chew, Rob and Kreuter, Frauke",
editor="Chen, Jessie Y. C. and Fragomeni, Gino and Degen, Helmut and Ntoa, Stavroula",
title="Improving Labeling Through Social Science Insights: Results and Research Agenda",
booktitle="HCI International 2022 -- Late Breaking Papers: Interacting with eXtended Reality and Artificial Intelligence",
year="2022",
publisher="Springer Nature Switzerland",
address="Cham",
pages="245--261",
isbn="978-3-031-21707-4"
}
```
|
akil-elkamel/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4201526
num_examples: 1000
download_size: 2247083
dataset_size: 4201526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
StarfleetAI/Code-290k-ShareGPT-MarkedLanguage | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 548206711
num_examples: 289094
download_size: 268926435
dataset_size: 548206711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Code-290k-ShareGPT-MarkedLanguage
It's [ajibawa-2023/Code-290k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Code-290k-ShareGPT), but each example is marked with the programming language it uses.
The detection was performed using heuristics, so there could be inaccuracies. Pull requests are welcome! |
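A toy version of such a heuristic (not the actual detector used for this dataset) might check fenced code blocks first and fall back to keyword patterns:

```python
import re

# Fenced code blocks often carry an explicit language tag, e.g. ```python.
FENCE_RE = re.compile(r"```(\w+)")

# Fallback keyword patterns; a real detector would cover many more languages.
KEYWORDS = {
    "python": re.compile(r"\bdef \w+\(|\bimport \w+"),
    "javascript": re.compile(r"\bfunction \w+\(|\bconst \w+ ="),
}

def guess_language(text: str) -> str:
    fence = FENCE_RE.search(text)
    if fence:
        return fence.group(1).lower()
    for lang, pattern in KEYWORDS.items():
        if pattern.search(text):
            return lang
    return "unknown"

print(guess_language("```python\nprint('hi')\n```"))          # python
print(guess_language("function add(a, b) { return a + b; }"))  # javascript
```

Keyword-based fallbacks are where most of the inaccuracies mentioned above would come from, since short snippets can match patterns from several languages.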
jmichaelov/inverse_scaling_prize-sig_figs | ---
license: cc-by-4.0
task_categories:
- multiple-choice
language:
- en
pretty_name: Sig Figs
--- |
Tristan/olm-wikipedia-20221220-1-percent | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 209366020.9708762
num_examples: 65879
download_size: 123017868
dataset_size: 209366020.9708762
---
# Dataset Card for "olm-wikipedia-20221220-1-percent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dwadden/science_adapt | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1219380696
num_examples: 328686
download_size: 551759701
dataset_size: 1219380696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tdh87/Randomized | ---
license: apache-2.0
---
|
DynamicSuperb/AccentClassification_AccentdbExtended | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 91766295.41136718
num_examples: 200
download_size: 61234603
dataset_size: 91766295.41136718
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "accent_classification_accentdb_extended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathan-roberts1/MLRSNet | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
sequence:
class_label:
names:
'0': airplane
'1': airport
'2': bare soil
'3': baseball diamond
'4': basketball court
'5': beach
'6': bridge
'7': buildings
'8': cars
'9': chaparral
'10': cloud
'11': containers
'12': crosswalk
'13': dense residential area
'14': desert
'15': dock
'16': factory
'17': field
'18': football field
'19': forest
'20': freeway
'21': golf course
'22': grass
'23': greenhouse
'24': gully
'25': habor
'26': intersection
'27': island
'28': lake
'29': mobile home
'30': mountain
'31': overpass
'32': park
'33': parking lot
'34': parkway
'35': pavement
'36': railway
'37': railway station
'38': river
'39': road
'40': roundabout
'41': runway
'42': sand
'43': sea
'44': ships
'45': snow
'46': snowberg
'47': sparse residential area
'48': stadium
'49': swimming pool
'50': tanks
'51': tennis court
'52': terrace
'53': track
'54': trail
'55': transmission tower
'56': trees
'57': water
'58': wetland
'59': wind turbine
splits:
- name: train
num_bytes: 1327782862.875
num_examples: 109161
download_size: 1304951717
dataset_size: 1327782862.875
license: cc-by-4.0
---
# Dataset Card for "MLRSNet"
## Dataset Description
- **Paper:** [MLRSNet: A multi-label high spatial resolution remote sensing dataset for semantic scene understanding](https://www.sciencedirect.com/science/article/pii/S0924271620302677)
### Licensing Information
CC BY 4.0
## Citation Information
[MLRSNet: A multi-label high spatial resolution remote sensing dataset for semantic scene understanding](https://www.sciencedirect.com/science/article/pii/S0924271620302677)
```
@article{qi2020mlrsnet,
title = {MLRSNet: A multi-label high spatial resolution remote sensing dataset for semantic scene understanding},
author = {Qi, Xiaoman and Zhu, Panpan and Wang, Yuebin and Zhang, Liqiang and Peng, Junhuan and Wu, Mengfan and Chen, Jialong and Zhao, Xudong and Zang, Ning and Mathiopoulos, P Takis},
year = 2020,
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
publisher = {Elsevier},
volume = 169,
pages = {337--350}
}
``` |
pravsels/videos_3b1b_code | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 13144954
num_examples: 353
download_size: 4516850
dataset_size: 13144954
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shmarymane/worldai | ---
license: mit
---
|
huggingartists/5nizza | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/5nizza"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.13617 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/289ded19d51d41798be99217d6059eb3.458x458x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/5nizza">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">5’Nizza</div>
<a href="https://genius.com/artists/5nizza">
<div style="text-align: center; font-size: 14px;">@5nizza</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/5nizza).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/5nizza")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|51| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/5nizza")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # the remainder after the first two cut points
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))]
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep | ---
pretty_name: Evaluation run of BFauber/opt125m_10e5_1ep
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt125m_10e5_1ep](https://huggingface.co/BFauber/opt125m_10e5_1ep) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T19:20:39.939834](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep/blob/main/results_2024-02-02T19-20-39.939834.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26615727535166933,\n\
\ \"acc_stderr\": 0.03094238339855658,\n \"acc_norm\": 0.26747140124253277,\n\
\ \"acc_norm_stderr\": 0.031765035154689494,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.42531103718835517,\n\
\ \"mc2_stderr\": 0.014922459227887219\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702252,\n\
\ \"acc_norm\": 0.23464163822525597,\n \"acc_norm_stderr\": 0.01238387356076866\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2877912766381199,\n\
\ \"acc_stderr\": 0.004518080594528024,\n \"acc_norm\": 0.3090021907986457,\n\
\ \"acc_norm_stderr\": 0.004611377019520816\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105655,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3394495412844037,\n \"acc_stderr\": 0.02030210934266235,\n \"\
acc_norm\": 0.3394495412844037,\n \"acc_norm_stderr\": 0.02030210934266235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.19831223628691982,\n \"acc_stderr\": 0.025955020841621115,\n\
\ \"acc_norm\": 0.19831223628691982,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n\
\ \"acc_stderr\": 0.022574519424174884,\n \"acc_norm\": 0.13004484304932734,\n\
\ \"acc_norm_stderr\": 0.022574519424174884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287414,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287414\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\
\ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\
\ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20561941251596424,\n\
\ \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n\
\ \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.010946570966348783,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.010946570966348783\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21568627450980393,\n \"acc_stderr\": 0.01663931935031326,\n \
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.01663931935031326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.03106939026078943,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.03106939026078943\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.42531103718835517,\n\
\ \"mc2_stderr\": 0.014922459227887219\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330352\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt125m_10e5_1ep
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-20-39.939834.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- '**/details_harness|winogrande|5_2024-02-02T19-20-39.939834.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T19-20-39.939834.parquet'
- config_name: results
data_files:
- split: 2024_02_02T19_20_39.939834
path:
- results_2024-02-02T19-20-39.939834.parquet
- split: latest
path:
- results_2024-02-02T19-20-39.939834.parquet
---
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_1ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_1ep](https://huggingface.co/BFauber/opt125m_10e5_1ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T19:20:39.939834](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep/blob/main/results_2024-02-02T19-20-39.939834.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26615727535166933,
"acc_stderr": 0.03094238339855658,
"acc_norm": 0.26747140124253277,
"acc_norm_stderr": 0.031765035154689494,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.42531103718835517,
"mc2_stderr": 0.014922459227887219
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702252,
"acc_norm": 0.23464163822525597,
"acc_norm_stderr": 0.01238387356076866
},
"harness|hellaswag|10": {
"acc": 0.2877912766381199,
"acc_stderr": 0.004518080594528024,
"acc_norm": 0.3090021907986457,
"acc_norm_stderr": 0.004611377019520816
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105655,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3394495412844037,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.3394495412844037,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.19831223628691982,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.19831223628691982,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.022574519424174884,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.022574519424174884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287414,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287414
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348783,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.01663931935031326,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.01663931935031326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078943,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078943
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.42531103718835517,
"mc2_stderr": 0.014922459227887219
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330352
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
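The nested results structure above is plain JSON, so it can be navigated with ordinary Python. As a minimal sketch (using only a small excerpt of the dict shown here, not the full download), here is one way to collect per-task accuracies and find the strongest task:

```python
# A small excerpt of the "all"/per-task structure shown above; the real
# dict has one entry per harness task.
results = {
    "all": {"acc": 0.26615727535166933, "acc_norm": 0.26747140124253277},
    "harness|winogrande|5": {"acc": 0.5067087608524072},
    "harness|gsm8k|5": {"acc": 0.0},
}

# Collect per-task accuracies, skipping the aggregate "all" entry.
task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

best_task = max(task_acc, key=task_acc.get)
print(best_task)  # harness|winogrande|5
```

The same pattern applies to the full dict loaded from the `results` configuration: filter out the `"all"` aggregate, then sort or rank tasks by whichever metric you need.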
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bassie96code/train_test_valid_wettekst | ---
dataset_info:
features:
- name: tok_wettekst
sequence: string
- name: aantal tokens
dtype: int64
- name: label lijsten
sequence: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6272
num_examples: 10
download_size: 4886
dataset_size: 6272
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "train_test_valid_wettekst"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orgcatorg/israel-hamas-gaza-cnbc | ---
dataset_info:
features:
- name: '@type'
dtype: string
- name: headline
dtype: string
- name: url
dtype: string
- name: dateModified
dtype: string
- name: datePublished
dtype: string
- name: mainEntityOfPage
dtype: string
- name: articleBody
dtype: string
- name: publisher
dtype: string
- name: image
dtype: string
- name: thumbnailUrl
dtype: string
- name: video
dtype: string
splits:
- name: train
num_bytes: 668826
num_examples: 335
download_size: 0
dataset_size: 668826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "israel-hamas-gaza-cnbc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
k3w15hu8h/MissBert-Data | ---
license: mit
---
|
GEM/CACAPO_E2E | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- nl
- en
tags:
- Reverse Engineered
- Dutch
- English
- RDF to sentence
- For End To End
pretty_name: Cacapo_E2E
size_categories:
- 10K<n<100K
---
The full dataset card is available in the JSON file named "original_cacapo_for_e2e_models-02_13_2023_19_30_07", which was made with GEM's second datacard-creation GUI. |
pissack1234/tigo-tanzania-personal-data-2023 | ---
license: apache-2.0
---
|
Orenbac/dataset | ---
license: mit
---
|
AdapterOcean/oasst_top1_standardized_unified | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 22136590
num_examples: 12946
download_size: 13050831
dataset_size: 22136590
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_top1_standardized_unified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_44 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 71206187
num_examples: 7080
download_size: 21088425
dataset_size: 71206187
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_44"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamjweintraut/eli5_lfqa | ---
dataset_info:
features:
- name: index
dtype: int64
- name: q_id
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: all_answers
sequence: string
- name: num_answers
dtype: int64
- name: context
dtype: string
- name: orig
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 2524358932.8466535
num_examples: 183333
- name: test
num_bytes: 315550030.0766733
num_examples: 22917
- name: validation
num_bytes: 315550030.0766733
num_examples: 22917
download_size: 1900389956
dataset_size: 3155458993.0000005
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B | ---
pretty_name: Evaluation run of FelixChao/NarutoDolphin-10B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/NarutoDolphin-10B](https://huggingface.co/FelixChao/NarutoDolphin-10B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T12:12:30.168914](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B/blob/main/results_2024-01-14T12-12-30.168914.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6306583942825644,\n\
\ \"acc_stderr\": 0.03252627508388141,\n \"acc_norm\": 0.632276909104878,\n\
\ \"acc_norm_stderr\": 0.03317986227116511,\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5912860013096678,\n\
\ \"mc2_stderr\": 0.015586868131613507\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038083\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6542521410077674,\n\
\ \"acc_stderr\": 0.0047463946133845325,\n \"acc_norm\": 0.841665006970723,\n\
\ \"acc_norm_stderr\": 0.0036430875292137216\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.01374079725857982,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.01374079725857982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n\
\ \"acc_stderr\": 0.0160943387684746,\n \"acc_norm\": 0.3642458100558659,\n\
\ \"acc_norm_stderr\": 0.0160943387684746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013014,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5912860013096678,\n\
\ \"mc2_stderr\": 0.015586868131613507\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5943896891584534,\n \
\ \"acc_stderr\": 0.013524848894462115\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/NarutoDolphin-10B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|arc:challenge|25_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|gsm8k|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hellaswag|10_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T12-12-30.168914.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- '**/details_harness|winogrande|5_2024-01-14T12-12-30.168914.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T12-12-30.168914.parquet'
- config_name: results
data_files:
- split: 2024_01_14T12_12_30.168914
path:
- results_2024-01-14T12-12-30.168914.parquet
- split: latest
path:
- results_2024-01-14T12-12-30.168914.parquet
---
# Dataset Card for Evaluation run of FelixChao/NarutoDolphin-10B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/NarutoDolphin-10B](https://huggingface.co/FelixChao/NarutoDolphin-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T12:12:30.168914](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NarutoDolphin-10B/blob/main/results_2024-01-14T12-12-30.168914.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6306583942825644,
"acc_stderr": 0.03252627508388141,
"acc_norm": 0.632276909104878,
"acc_norm_stderr": 0.03317986227116511,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5912860013096678,
"mc2_stderr": 0.015586868131613507
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038083
},
"harness|hellaswag|10": {
"acc": 0.6542521410077674,
"acc_stderr": 0.0047463946133845325,
"acc_norm": 0.841665006970723,
"acc_norm_stderr": 0.0036430875292137216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640773,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640773
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.01374079725857982,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.01374079725857982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.0160943387684746,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.0160943387684746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.02563082497562135,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.02563082497562135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013014,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5912860013096678,
"mc2_stderr": 0.015586868131613507
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.013524848894462115
}
}
```
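For readers unfamiliar with the layout above: the top-level `"all"` block aggregates the per-task metrics, and appears to be roughly an unweighted mean over the evaluated tasks. A minimal sketch of that kind of aggregation, using a small hypothetical subset of the accuracies above (the actual leaderboard averages over all tasks, so the number here will not match the `"all"` value):

```python
# Hypothetical subset of per-task accuracies copied from the results above.
# The real "all" block averages over every evaluated task, not just these three.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.34,
    "harness|hendrycksTest-anatomy|5": 0.6296296296296297,
    "harness|hendrycksTest-astronomy|5": 0.7105263157894737,
}

# Unweighted mean across the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))  # → 0.5601
```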
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Kkoustubh/iPhone14Tweets | ---
license: cc
---
Approximately 144K tweets about the iPhone 14.
open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b | ---
pretty_name: Evaluation run of MetaIX/GPT4-X-Alpasta-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MetaIX/GPT4-X-Alpasta-30b](https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
  \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T08:07:45.972235](https://huggingface.co/datasets/open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b/blob/main/results_2023-09-17T08-07-45.972235.json)(note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31312919463087246,\n\
\ \"em_stderr\": 0.00474940232599683,\n \"f1\": 0.4037961409395989,\n\
\ \"f1_stderr\": 0.0045737911370298204,\n \"acc\": 0.5434694672544375,\n\
\ \"acc_stderr\": 0.012140181814727365\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31312919463087246,\n \"em_stderr\": 0.00474940232599683,\n\
\ \"f1\": 0.4037961409395989,\n \"f1_stderr\": 0.0045737911370298204\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30477634571645185,\n \
\ \"acc_stderr\": 0.012679297549515406\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ }\n}\n```"
repo_url: https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T08_07_45.972235
path:
- '**/details_harness|drop|3_2023-09-17T08-07-45.972235.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T08-07-45.972235.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T08_07_45.972235
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-07-45.972235.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-07-45.972235.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:29:11.642048.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:29:11.642048.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T08_07_45.972235
path:
- '**/details_harness|winogrande|5_2023-09-17T08-07-45.972235.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T08-07-45.972235.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_29_11.642048
path:
- results_2023-07-19T22:29:11.642048.parquet
- split: 2023_09_17T08_07_45.972235
path:
- results_2023-09-17T08-07-45.972235.parquet
- split: latest
path:
- results_2023-09-17T08-07-45.972235.parquet
---
# Dataset Card for Evaluation run of MetaIX/GPT4-X-Alpasta-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MetaIX/GPT4-X-Alpasta-30b](https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T08:07:45.972235](https://huggingface.co/datasets/open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b/blob/main/results_2023-09-17T08-07-45.972235.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31312919463087246,
"em_stderr": 0.00474940232599683,
"f1": 0.4037961409395989,
"f1_stderr": 0.0045737911370298204,
"acc": 0.5434694672544375,
"acc_stderr": 0.012140181814727365
},
"harness|drop|3": {
"em": 0.31312919463087246,
"em_stderr": 0.00474940232599683,
"f1": 0.4037961409395989,
"f1_stderr": 0.0045737911370298204
},
"harness|gsm8k|5": {
"acc": 0.30477634571645185,
"acc_stderr": 0.012679297549515406
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
}
}
```
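As a minimal sketch of post-processing these aggregated numbers (using the values quoted above rather than a live `load_dataset` call), one might pull out the accuracy-style metric for each task that reports one:

```python
# Aggregated metrics copied from the "Latest results" block above
# (run 2023-09-17T08:07:45.972235; hard-coded here, not a live download).
latest_results = {
    "harness|drop|3": {"em": 0.31312919463087246, "f1": 0.4037961409395989},
    "harness|gsm8k|5": {"acc": 0.30477634571645185},
    "harness|winogrande|5": {"acc": 0.7821625887924231},
}

# Keep only the tasks that report an "acc" metric
# (drop reports em/f1 instead, so it is filtered out).
accs = {task: m["acc"] for task, m in latest_results.items() if "acc" in m}
print(accs)
```

The same filtering works on the full `"results"` configuration once loaded, since each task's metrics are stored under the same `harness|<task>|<n_shot>` keys.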
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Falah/side_profile_portraits_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1879316
num_examples: 10000
download_size: 248937
dataset_size: 1879316
---
# Dataset Card for "side_profile_portraits_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BrandonZYW/NYTClustering | ---
configs:
- config_name: location
data_files:
- split: test
path: location.csv
- config_name: topic
data_files:
- split: test
path: topic.csv
license: mit
--- |
IsraelAyo/SNET_Archive | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
- text2text-generation
language:
- en
pretty_name: SNET
size_categories:
- 100K<n<1M
--- |