datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
mrm8488/large_spanish_corpus_ds_tokenized_and_gropuped | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 16824296700
num_examples: 4103487
- name: test
num_bytes: 885489300
num_examples: 215973
download_size: 8311975924
dataset_size: 17709786000
---
# Dataset Card for "large_spanish_corpus_ds_tokenized_and_gropuped"
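The card lists only pre-tokenized `input_ids`, which suggests the corpus was tokenized and then grouped into fixed-length blocks. A minimal sketch of that standard grouping step follows; the card does not state the actual block size, so `block_size` below is an illustrative assumption:

```python
# Standard "group texts" step: concatenate tokenized examples and re-split
# them into fixed-length blocks, dropping the trailing remainder.
# block_size=4 is an illustrative assumption, not the dataset's actual value.
def group_texts(batches_of_ids, block_size=4):
    concatenated = [tok for ids in batches_of_ids for tok in ids]
    total = (len(concatenated) // block_size) * block_size  # drop remainder
    return [concatenated[i:i + block_size] for i in range(0, total, block_size)]

blocks = group_texts([[1, 2, 3], [4, 5], [6, 7, 8, 9]], block_size=4)
# blocks == [[1, 2, 3, 4], [5, 6, 7, 8]]
```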
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_first_sent_train_10_eval_10_sentbefore | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 69119
num_examples: 50
- name: validation
num_bytes: 9130
num_examples: 10
download_size: 45538
dataset_size: 78249
---
# Dataset Card for "find_first_sent_train_10_eval_10_sentbefore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Prarabdha/Rick_and_Morty_Transcript | ---
license: mit
---
## Context
I got the inspiration for this dataset from the [Rick&Morty Scripts](https://www.kaggle.com/datasets/andradaolteanu/rickmorty-scripts) by [Andrada Olteanu](https://www.kaggle.com/andradaolteanu), but felt that dataset was a little small and outdated.
This dataset includes almost all episodes up to Season 5. More data will be added.
## Content
Rick and Morty Transcripts:
- index: index of the row
- speaker: the character's name
- dialogue: the dialogue of the character
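With the three columns above, per-character filtering is straightforward. A minimal sketch, where the sample rows are invented for illustration and are not taken from the actual dataset:

```python
# Invented sample rows matching the columns described above.
transcript = [
    {"index": 0, "speaker": "Rick", "dialogue": "Wubba lubba dub dub!"},
    {"index": 1, "speaker": "Morty", "dialogue": "Aw geez, Rick."},
]

def lines_by(speaker, rows):
    """Return all dialogue lines spoken by the given character."""
    return [r["dialogue"] for r in rows if r["speaker"] == speaker]

rick_lines = lines_by("Rick", transcript)  # ["Wubba lubba dub dub!"]
```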
## Acknowledgements
Thanks to the transcripts made available by
- [RickandMorty.fandom.com](https://rickandmorty.fandom.com/)
- [RickandMorty.newtfire.org](http://rickandmorty.newtfire.org/transcripts.html) |
BigScienceBiasEval/bias-shades | ---
license: cc-by-sa-4.0
language:
- ar
- en
- fr
- de
- hi
- ru
- es
- ta
---
Possibly a placeholder dataset for the original here: https://huggingface.co/datasets/bigscience-catalogue-data/bias-shades
# Data Statement for SHADES
> **How to use this document:**
> Fill in each section according to the instructions. Give as much detail as you can, but there's no need to extrapolate. The goal is to help people understand your data when they approach it. This could be someone looking at it in ten years, or it could be you yourself looking back at the data in two years.
> For full details, the best source is the original Data Statements paper, here: https://www.aclweb.org/anthology/Q18-1041/ .
> Instruction fields are given as blockquotes; delete the instructions when you're done, and provide the file with your data, for example as "DATASTATEMENT.md". The lists in some blocks are designed to be filled in, but it's good to also leave a written description of what's happening, as well as the list. It's fine to skip some fields if the information isn't known.
> Only blockquoted content should be deleted; the final about statement should be left intact.
Data set name: Bias-Shades
Citation (if available): TODO.
Data set developer(s): This dataset was compiled by dozens of research scientists through the BigScience open science collaboration. Collaborators, representing numerous cultures and languages, joined the project of their own volition.
Data statement author(s): Shayne Longpre, Aurélie Névéol, Shanya Sharma[Add name here if you add/edit the data statement :)].
Others who contributed to this document: N/A
License: Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0).
## A. CURATION RATIONALE
> *Explanation.* Which texts were included and what were the goals in selecting texts, both in the original collection and in any further sub-selection? This can be especially important in datasets too large to thoroughly inspect by hand. An explicit statement of the curation rationale can help dataset users make inferences about what other kinds of texts systems trained with them could conceivably generalize to.
This dataset was curated by hand-crafting stereotype sentences by native speakers from the culture being targeted. An initial set of sentences was inferred from stereotypes expressed in the CrowS-Pairs dataset (Nangia et al.). Native speakers first crafted templates for sentences expressing a stereotype. These templates are marked for gender and plurality of the target nouns, so a template can be reused by substituting different targets. Next, the template-target noun pair combinations were annotated for the veracity/reliability of the expressed stereotype. The resulting sentences express common and less common stereotypes in a variety of cultures and languages.
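The template mechanism described above can be sketched roughly as follows; the template syntax, plurality marker, and target nouns here are invented placeholders, not the dataset's actual format:

```python
# Hypothetical sketch of reusing a stereotype template with different targets.
# The {target}/{verb} syntax and the plural flag are invented for illustration;
# the dataset's real templates and markers may differ.
def fill_template(template, target, plural=False):
    verb = "are" if plural else "is"
    return template.format(target=target, verb=verb)

template = "{target} {verb} always punctual."
sentences = [fill_template(template, t, plural=True)
             for t in ["Engineers", "Musicians"]]
# Each filled sentence would then be annotated for how real/prevalent
# the expressed stereotype is.
```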
## B. LANGUAGE VARIETY/VARIETIES
> *Explanation.* Languages differ from each other in structural ways that can interact with NLP algorithms. Within a language, regional or social dialects can also show great variation (Chambers and Trudgill, 1998). The language and language variety should be described with a language tag from BCP-47 identifying the language variety (e.g., en-US or yue-Hant-HK), and a prose description of the language variety, glossing the BCP-47 tag and also providing further information (e.g., "English as spoken in Palo Alto, California", or "Cantonese written with traditional characters by speakers in Hong Kong who are bilingual in Mandarin").
* BCP-47 language tags: en-US, fr-FR, hi-IN, es-DO, ar-LY, ru-RU, de-DE, nl-NL, ta-IN.
* Language variety description: English spoken by native speakers of the United States, native French people from metropolitan France, native Hindi and Tamil speakers from India, Spanish speakers from the Dominican Republic, Arabic speakers from Libya, Russian speakers from Russia, German speakers from Germany, and Dutch speakers from the Netherlands.
## C. CONTRIBUTOR DEMOGRAPHIC
> ## C. SPEAKER DEMOGRAPHIC
> *Explanation.* Sociolinguistics has found that variation (in pronunciation, prosody, word choice, and grammar) correlates with speaker demographic characteristics (Labov, 1966), as speakers use linguistic variation to construct and project identities (Eckert and Rickford, 2001). Transfer from native languages (L1) can affect the language produced by non-native (L2) speakers (Ellis, 1994, Ch. 8). A further important type of variation is disordered speech (e.g., dysarthria). Specifications include:
Participants in the collection project were recruited through the HuggingFace BigScience project, specifically the Bias and Fairness Evaluation group. They are listed below.
Speakers:
* [ADD YOURSELF!]
* Shayne Longpre: English-speaking, male, 28 years old, culturally Canadian.
* Aurélie Névéol: French (native), English and Spanish speaking, female, 44 years old, culturally French (also familiar with American culture)
* Shanya Sharma: Hindi(native), English speaking, female, 24 years old, culturally Indian
* Margaret Mitchell: English, female, mid-30s, U.S.A.
* Maraim Masoud: Arabic, English Speaking female.
* Arjun Subramonian: English, Spanish, Tamil, non-binary, early-20s, USA, culturally Indian-American
## D. ANNOTATOR DEMOGRAPHIC
> *Explanation.* What are the demographic characteristics of the annotators and annotation guideline developers? Their own “social address” influences their experience with language and thus their perception of what they are annotating. Specifications include:
Participants in the collection project were recruited through the HuggingFace BigScience project, specifically the Bias and Fairness Evaluation group. Speaker and annotator contributors are listed in section C.
## E. SPEECH SITUATION
N/A
## F. TEXT CHARACTERISTICS
> *Explanation.* Both genre and topic influence the vocabulary and structural characteristics of texts (Biber, 1995), and should be specified.
The collected data is a set of offensive stereotyped statements in numerous languages and cultures. They may be upsetting and/or offensive.
Along with these stereotyped statements are annotation judgements of how prevalent/real the expressed stereotypes are in the real world. Some statements were created from templates with substituted target nouns, and may therefore express an uncommon or unlikely stereotype.
## G. RECORDING QUALITY
N/A
## H. OTHER
> *Explanation.* There may be other information of relevance as well. Please use this space to develop any further categories that are relevant for your dataset.
## I. PROVENANCE APPENDIX
This initiative is part of the BigScience Workshop: https://bigscience.huggingface.co/.
## About this document
A data statement is a characterization of a dataset that provides context to allow developers and users to better understand how experimental results might generalize, how software might be appropriately deployed, and what biases might be reflected in systems built on the software.
Data Statements are from the University of Washington. Contact: [datastatements@uw.edu](mailto:datastatements@uw.edu). This document template is licensed as [CC0](https://creativecommons.org/share-your-work/public-domain/cc0/).
This version of the markdown Data Statement is from June 4th 2020. The Data Statement template is based on worksheets distributed at the [2020 LREC workshop on Data Statements](https://sites.google.com/uw.edu/data-statements-for-nlp/), by Emily M. Bender, Batya Friedman, and Angelina McMillan-Major. Adapted to community Markdown template by Leon Derczynski. |
NYTK/HuSST | ---
annotations_creators:
- found
language_creators:
- found
- expert-generated
language:
- hu
license:
- bsd-2-clause
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- extended|other
task_categories:
- text-classification
task_ids:
- sentiment-classification
- sentiment-scoring
- text-scoring
pretty_name: HuSST
---
# Dataset Card for HuSST
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Language](#language)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
[HuSST dataset](https://github.com/nytud/HuSST)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
[lnnoemi](mailto:ligeti-nagy.noemi@nytud.hu)
### Dataset Summary
This is the dataset card for the Hungarian version of the Stanford Sentiment Treebank. The dataset is also part of the Hungarian Language Understanding Evaluation Benchmark Kit [HuLU](https://hulu.nlp.nytud.hu). The corpus was created by translating and re-annotating the original SST (Socher et al., 2013).
### Supported Tasks and Leaderboards
'sentiment classification'
'sentiment scoring'
### Language
The BCP-47 code for Hungarian, the only represented language in this dataset, is hu-HU.
## Dataset Structure
### Data Instances
For each instance, there is an id, a sentence and a sentiment label.
An example:
```
{
"Sent_id": "dev_0",
"Sent": "Nos, a Jason elment Manhattanbe és a Pokolba kapcsán, azt hiszem, az elkerülhetetlen folytatások ötletlistájáról kihúzhatunk egy űrállomást 2455-ben (hé, ne lődd le a poént).",
"Label": "neutral"
}
```
### Data Fields
- Sent_id: unique id of the instances;
- Sent: the sentence, translation of an instance of the SST dataset;
- Label: "negative", "neutral", or "positive".
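For sentiment-classification training, the string labels typically need to be mapped to integer class ids. A minimal sketch, where the id assignment is an assumption and is not prescribed by the card:

```python
# Assumed label-to-id mapping; the card does not prescribe specific ids.
LABEL2ID = {"negative": 0, "neutral": 1, "positive": 2}

def encode(example):
    """Attach an integer class id to a HuSST example."""
    return {**example, "label_id": LABEL2ID[example["Label"]]}

row = {"Sent_id": "dev_0", "Sent": "Példa mondat.", "Label": "neutral"}
encoded = encode(row)  # encoded["label_id"] == 1
```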
### Data Splits
HuSST has 3 splits: *train*, *validation* and *test*.
| Dataset split | Number of instances in the split |
|---------------|----------------------------------|
| train | 9344 |
| validation | 1168 |
| test | 1168 |
The test data is distributed without the labels. To evaluate your model, please [contact us](mailto:ligeti-nagy.noemi@nytud.hu), or check [HuLU's website](https://hulu.nlp.nytud.hu) for an automatic evaluation (this feature is under construction at the moment).
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The data is a translation of the content of the SST dataset (only the whole sentences were used). Each sentence was translated by a human translator. Each translation was manually checked and further refined by another annotator.
### Annotations
#### Annotation process
The translated sentences were annotated by three human annotators with one of the following labels: negative, neutral and positive. Each sentence was then curated by a fourth annotator (the 'curator'). The final label is the decision of the curator based on the three labels of the annotators.
#### Who are the annotators?
The translators were native Hungarian speakers with English proficiency. The annotators were university students with some linguistic background.
## Additional Information
### Licensing Information
### Citation Information
If you use this resource or any part of its documentation, please refer to:
Ligeti-Nagy, N., Ferenczi, G., Héja, E., Jelencsik-Mátyus, K., Laki, L. J., Vadász, N., Yang, Z. Gy. and Vadász, T. (2022) HuLU: magyar nyelvű benchmark adatbázis
kiépítése a neurális nyelvmodellek kiértékelése céljából [HuLU: Hungarian benchmark dataset to evaluate neural language models]. XVIII. Magyar Számítógépes Nyelvészeti Konferencia. pp. 431–446.
```
@inproceedings{ligetinagy2022hulu,
title={HuLU: magyar nyelvű benchmark adatbázis kiépítése a neurális nyelvmodellek kiértékelése céljából},
author={Ligeti-Nagy, N. and Ferenczi, G. and Héja, E. and Jelencsik-Mátyus, K. and Laki, L. J. and Vadász, N. and Yang, Z. Gy. and Vadász, T.},
booktitle={XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
year={2022},
pages = {431--446}
}
```
and to:
Socher et al. (2013), Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. In: Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing. 1631--1642.
```
@inproceedings{socher-etal-2013-recursive,
title = "Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank",
author = "Socher, Richard and
Perelygin, Alex and
Wu, Jean and
Chuang, Jason and
Manning, Christopher D. and
Ng, Andrew and
Potts, Christopher",
booktitle = "Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing",
month = oct,
year = "2013",
address = "Seattle, Washington, USA",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D13-1170",
pages = "1631--1642",
}
```
### Contributions
Thanks to [lnnoemi](https://github.com/lnnoemi) for adding this dataset. |
Minata/src_fm_fc_ms_ff_method2testcases_v0 | ---
dataset_info:
features:
- name: src_fm_fc_ms_ff
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 844891690
num_examples: 322763
- name: test
num_bytes: 226163897
num_examples: 83535
download_size: 219410250
dataset_size: 1071055587
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jtatman/orca_mini_uncensored_squad_format_train | ---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- question-answering
pretty_name: orca_mini_squad
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 118261864.35315199
num_examples: 67300
- name: test
num_bytes: 13140597.646848004
num_examples: 7478
download_size: 65276229
dataset_size: 131402462.0
---
# Dataset Card for "orca_mini_uncensored_squad_format_train"
## Dataset Description
Primarily an exercise in data extraction and formatting for dataset usage, and in cross-model usage of data.
The data is uncensored because when everything is sanitized for alignment, the data may be "pure" but is no longer ultimately realistic.
Part of an effort to create more question-answering-friendly datasets that can be used for specialized domain training on small models.
### Dataset Summary
This is a "squad reformat" of an existing dataset located here: https://huggingface.co/datasets/julep-ai/orca_mini_uncensored
It can be swapped in for SQuAD-format datasets in typical question-answering tasks, providing uncensored data from a partial pull of the mini-orca dataset here: psmathur/orca_minis_uncensored_dataset
### Supported Tasks and Leaderboards
- 'question-answering'
### Languages
The BCP-47 code for English as generally spoken in the United States is en-US and the BCP-47 code for English as generally spoken in the United Kingdom is en-GB. It is unknown if other varieties of English are represented in the data.
## Dataset Structure
Train and test splits are included.
### Data Format
As in the SQuAD v2 dataset, the columns are: "id", "title", "context", "question", and "answers" (a struct of "text" and "answer_start").
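Given those columns, the answer span can be recovered from the context via "answer_start". A minimal sketch, where the record below is invented to match the listed columns rather than drawn from the dataset:

```python
# Invented record mirroring the SQuAD-style columns listed above.
record = {
    "id": "0",
    "title": "example",
    "context": "Paris is the capital of France.",
    "question": "What is the capital of France?",
    "answers": {"text": "Paris", "answer_start": 0},
}

def answer_span(rec):
    """Slice the answer text out of the context using answer_start."""
    start = rec["answers"]["answer_start"]
    return rec["context"][start:start + len(rec["answers"]["text"])]

span = answer_span(record)  # "Paris"
```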
|
alignment/mm-cot | ---
license: apache-2.0
---
|
danielmalencar/quemSou | ---
dataset_info:
features:
- name: Context
dtype: float64
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 3091.3953488372094
num_examples: 30
- name: test
num_bytes: 1339.6046511627908
num_examples: 13
download_size: 6255
dataset_size: 4431.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
EstebanMax/lighthouse | ---
license: afl-3.0
---
|
open-llm-leaderboard/details_yam-peleg__Experiment9-7B | ---
pretty_name: Evaluation run of yam-peleg/Experiment9-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/Experiment9-7B](https://huggingface.co/yam-peleg/Experiment9-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment9-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T00:42:08.192431](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment9-7B/blob/main/results_2024-02-12T00-42-08.192431.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.657124599246198,\n\
\ \"acc_stderr\": 0.03199285830904183,\n \"acc_norm\": 0.6581476060335769,\n\
\ \"acc_norm_stderr\": 0.03263815632071338,\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7042270854773415,\n\
\ \"mc2_stderr\": 0.015001693034141303\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n\
\ \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.01311904089772592\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7125074686317466,\n\
\ \"acc_stderr\": 0.004516681953879087,\n \"acc_norm\": 0.880601473809998,\n\
\ \"acc_norm_stderr\": 0.003235941810943153\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n\
\ \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n\
\ \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n\
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.01666979959211203,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.01666979959211203\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7042270854773415,\n\
\ \"mc2_stderr\": 0.015001693034141303\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491902\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6376042456406369,\n \
\ \"acc_stderr\": 0.013240654263574767\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/Experiment9-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|arc:challenge|25_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|gsm8k|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hellaswag|10_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T00-42-08.192431.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- '**/details_harness|winogrande|5_2024-02-12T00-42-08.192431.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T00-42-08.192431.parquet'
- config_name: results
data_files:
- split: 2024_02_12T00_42_08.192431
path:
- results_2024-02-12T00-42-08.192431.parquet
- split: latest
path:
- results_2024-02-12T00-42-08.192431.parquet
---
# Dataset Card for Evaluation run of yam-peleg/Experiment9-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment9-7B](https://huggingface.co/yam-peleg/Experiment9-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment9-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T00:42:08.192431](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment9-7B/blob/main/results_2024-02-12T00-42-08.192431.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.657124599246198,
"acc_stderr": 0.03199285830904183,
"acc_norm": 0.6581476060335769,
"acc_norm_stderr": 0.03263815632071338,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7042270854773415,
"mc2_stderr": 0.015001693034141303
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.01311904089772592
},
"harness|hellaswag|10": {
"acc": 0.7125074686317466,
"acc_stderr": 0.004516681953879087,
"acc_norm": 0.880601473809998,
"acc_norm_stderr": 0.003235941810943153
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.01666979959211203,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.01666979959211203
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7042270854773415,
"mc2_stderr": 0.015001693034141303
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491902
},
"harness|gsm8k|5": {
"acc": 0.6376042456406369,
"acc_stderr": 0.013240654263574767
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
crazylearners/p4demo | ---
license: cc-by-nc-sa-4.0
---
|
yjernite/prof_report__dreamlike-art-dreamlike-photoreal-2.0__multi__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: accountant
num_bytes: 1672
num_examples: 3
- name: aerospace_engineer
num_bytes: 1864
num_examples: 11
- name: aide
num_bytes: 1768
num_examples: 7
- name: air_conditioning_installer
num_bytes: 1696
num_examples: 4
- name: architect
num_bytes: 1696
num_examples: 4
- name: artist
num_bytes: 1936
num_examples: 14
- name: author
num_bytes: 1720
num_examples: 5
- name: baker
num_bytes: 1672
num_examples: 3
- name: bartender
num_bytes: 1672
num_examples: 3
- name: bus_driver
num_bytes: 1864
num_examples: 11
- name: butcher
num_bytes: 1744
num_examples: 6
- name: career_counselor
num_bytes: 1696
num_examples: 4
- name: carpenter
num_bytes: 1696
num_examples: 4
- name: carpet_installer
num_bytes: 1696
num_examples: 4
- name: cashier
num_bytes: 1792
num_examples: 8
- name: ceo
num_bytes: 1672
num_examples: 3
- name: childcare_worker
num_bytes: 1840
num_examples: 10
- name: civil_engineer
num_bytes: 1744
num_examples: 6
- name: claims_appraiser
num_bytes: 1696
num_examples: 4
- name: cleaner
num_bytes: 1888
num_examples: 12
- name: clergy
num_bytes: 1720
num_examples: 5
- name: clerk
num_bytes: 1792
num_examples: 8
- name: coach
num_bytes: 1648
num_examples: 2
- name: community_manager
num_bytes: 1768
num_examples: 7
- name: compliance_officer
num_bytes: 1720
num_examples: 5
- name: computer_programmer
num_bytes: 1720
num_examples: 5
- name: computer_support_specialist
num_bytes: 1816
num_examples: 9
- name: computer_systems_analyst
num_bytes: 1888
num_examples: 12
- name: construction_worker
num_bytes: 1720
num_examples: 5
- name: cook
num_bytes: 1720
num_examples: 5
- name: correctional_officer
num_bytes: 1816
num_examples: 9
- name: courier
num_bytes: 1720
num_examples: 5
- name: credit_counselor
num_bytes: 1720
num_examples: 5
- name: customer_service_representative
num_bytes: 1744
num_examples: 6
- name: data_entry_keyer
num_bytes: 1624
num_examples: 1
- name: dental_assistant
num_bytes: 1696
num_examples: 4
- name: dental_hygienist
num_bytes: 1672
num_examples: 3
- name: dentist
num_bytes: 1672
num_examples: 3
- name: designer
num_bytes: 1744
num_examples: 6
- name: detective
num_bytes: 1672
num_examples: 3
- name: director
num_bytes: 1672
num_examples: 3
- name: dishwasher
num_bytes: 1768
num_examples: 7
- name: dispatcher
num_bytes: 1672
num_examples: 3
- name: doctor
num_bytes: 1672
num_examples: 3
- name: drywall_installer
num_bytes: 1696
num_examples: 4
- name: electrical_engineer
num_bytes: 1816
num_examples: 9
- name: electrician
num_bytes: 1696
num_examples: 4
- name: engineer
num_bytes: 1696
num_examples: 4
- name: event_planner
num_bytes: 1672
num_examples: 3
- name: executive_assistant
num_bytes: 1696
num_examples: 4
- name: facilities_manager
num_bytes: 1720
num_examples: 5
- name: farmer
num_bytes: 1648
num_examples: 2
- name: fast_food_worker
num_bytes: 1912
num_examples: 13
- name: file_clerk
num_bytes: 1816
num_examples: 9
- name: financial_advisor
num_bytes: 1624
num_examples: 1
- name: financial_analyst
num_bytes: 1672
num_examples: 3
- name: financial_manager
num_bytes: 1672
num_examples: 3
- name: firefighter
num_bytes: 1696
num_examples: 4
- name: fitness_instructor
num_bytes: 1744
num_examples: 6
- name: graphic_designer
num_bytes: 1792
num_examples: 8
- name: groundskeeper
num_bytes: 1720
num_examples: 5
- name: hairdresser
num_bytes: 1768
num_examples: 7
- name: head_cook
num_bytes: 1720
num_examples: 5
- name: health_technician
num_bytes: 1816
num_examples: 9
- name: industrial_engineer
num_bytes: 1720
num_examples: 5
- name: insurance_agent
num_bytes: 1672
num_examples: 3
- name: interior_designer
num_bytes: 1792
num_examples: 8
- name: interviewer
num_bytes: 1744
num_examples: 6
- name: inventory_clerk
num_bytes: 1816
num_examples: 9
- name: it_specialist
num_bytes: 1648
num_examples: 2
- name: jailer
num_bytes: 1696
num_examples: 4
- name: janitor
num_bytes: 1768
num_examples: 7
- name: laboratory_technician
num_bytes: 1840
num_examples: 10
- name: language_pathologist
num_bytes: 1720
num_examples: 5
- name: lawyer
num_bytes: 1696
num_examples: 4
- name: librarian
num_bytes: 1696
num_examples: 4
- name: logistician
num_bytes: 1720
num_examples: 5
- name: machinery_mechanic
num_bytes: 1720
num_examples: 5
- name: machinist
num_bytes: 1648
num_examples: 2
- name: maid
num_bytes: 1744
num_examples: 6
- name: manager
num_bytes: 1696
num_examples: 4
- name: manicurist
num_bytes: 1720
num_examples: 5
- name: market_research_analyst
num_bytes: 1696
num_examples: 4
- name: marketing_manager
num_bytes: 1696
num_examples: 4
- name: massage_therapist
num_bytes: 1792
num_examples: 8
- name: mechanic
num_bytes: 1744
num_examples: 6
- name: mechanical_engineer
num_bytes: 1720
num_examples: 5
- name: medical_records_specialist
num_bytes: 1744
num_examples: 6
- name: mental_health_counselor
num_bytes: 1840
num_examples: 10
- name: metal_worker
num_bytes: 1696
num_examples: 4
- name: mover
num_bytes: 1864
num_examples: 11
- name: musician
num_bytes: 1744
num_examples: 6
- name: network_administrator
num_bytes: 1624
num_examples: 1
- name: nurse
num_bytes: 1648
num_examples: 2
- name: nursing_assistant
num_bytes: 1696
num_examples: 4
- name: nutritionist
num_bytes: 1648
num_examples: 2
- name: occupational_therapist
num_bytes: 1696
num_examples: 4
- name: office_clerk
num_bytes: 1744
num_examples: 6
- name: office_worker
num_bytes: 1768
num_examples: 7
- name: painter
num_bytes: 1696
num_examples: 4
- name: paralegal
num_bytes: 1768
num_examples: 7
- name: payroll_clerk
num_bytes: 1720
num_examples: 5
- name: pharmacist
num_bytes: 1768
num_examples: 7
- name: pharmacy_technician
num_bytes: 1792
num_examples: 8
- name: photographer
num_bytes: 1792
num_examples: 8
- name: physical_therapist
num_bytes: 1672
num_examples: 3
- name: pilot
num_bytes: 1744
num_examples: 6
- name: plane_mechanic
num_bytes: 1768
num_examples: 7
- name: plumber
num_bytes: 1696
num_examples: 4
- name: police_officer
num_bytes: 1744
num_examples: 6
- name: postal_worker
num_bytes: 1744
num_examples: 6
- name: printing_press_operator
num_bytes: 1816
num_examples: 9
- name: producer
num_bytes: 1696
num_examples: 4
- name: psychologist
num_bytes: 1720
num_examples: 5
- name: public_relations_specialist
num_bytes: 1672
num_examples: 3
- name: purchasing_agent
num_bytes: 1720
num_examples: 5
- name: radiologic_technician
num_bytes: 1816
num_examples: 9
- name: real_estate_broker
num_bytes: 1696
num_examples: 4
- name: receptionist
num_bytes: 1672
num_examples: 3
- name: repair_worker
num_bytes: 1720
num_examples: 5
- name: roofer
num_bytes: 1696
num_examples: 4
- name: sales_manager
num_bytes: 1624
num_examples: 1
- name: salesperson
num_bytes: 1672
num_examples: 3
- name: school_bus_driver
num_bytes: 1864
num_examples: 11
- name: scientist
num_bytes: 1744
num_examples: 6
- name: security_guard
num_bytes: 1672
num_examples: 3
- name: sheet_metal_worker
num_bytes: 1744
num_examples: 6
- name: singer
num_bytes: 1768
num_examples: 7
- name: social_assistant
num_bytes: 1816
num_examples: 9
- name: social_worker
num_bytes: 1816
num_examples: 9
- name: software_developer
num_bytes: 1648
num_examples: 2
- name: stocker
num_bytes: 1792
num_examples: 8
- name: supervisor
num_bytes: 1744
num_examples: 6
- name: taxi_driver
num_bytes: 1720
num_examples: 5
- name: teacher
num_bytes: 1744
num_examples: 6
- name: teaching_assistant
num_bytes: 1744
num_examples: 6
- name: teller
num_bytes: 1792
num_examples: 8
- name: therapist
num_bytes: 1792
num_examples: 8
- name: tractor_operator
num_bytes: 1672
num_examples: 3
- name: truck_driver
num_bytes: 1672
num_examples: 3
- name: tutor
num_bytes: 1816
num_examples: 9
- name: underwriter
num_bytes: 1720
num_examples: 5
- name: veterinarian
num_bytes: 1648
num_examples: 2
- name: welder
num_bytes: 1744
num_examples: 6
- name: wholesale_buyer
num_bytes: 1768
num_examples: 7
- name: writer
num_bytes: 1768
num_examples: 7
download_size: 632511
dataset_size: 253040
---
# Dataset Card for "prof_report__dreamlike-art-dreamlike-photoreal-2.0__multi__24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
csaybar/CloudSEN12-high | ---
license: cc-by-nc-4.0
---
# **CloudSEN12 HIGH-QUALITY**
## **A Benchmark Dataset for Cloud Semantic Understanding**

CloudSEN12 is a LARGE dataset (~1 TB) for cloud semantic understanding that consists of 49,400 image patches (IP) that are
evenly spread throughout all continents except Antarctica. Each IP covers 5090 x 5090 meters and contains data from Sentinel-2
levels 1C and 2A, hand-crafted annotations of thick and thin clouds and cloud shadows, Sentinel-1 Synthetic Aperture Radar (SAR),
digital elevation model, surface water occurrence, land cover classes, and cloud mask results from six cutting-edge
cloud detection algorithms.
CloudSEN12 is designed to support both weakly and self-/semi-supervised learning strategies by including three distinct forms of
hand-crafted labeling data: high-quality, scribble and no-annotation. For more details on how we created the dataset see our
paper.
Ready to start using **[CloudSEN12](https://cloudsen12.github.io/)**?
**[Download Dataset](https://cloudsen12.github.io/download.html)**
**[Paper - Scientific Data](https://www.nature.com/articles/s41597-022-01878-2)**
**[Inference on a new S2 image](https://colab.research.google.com/github/cloudsen12/examples/blob/master/example02.ipynb)**
**[Enter to cloudApp](https://github.com/cloudsen12/CloudApp)**
**[CloudSEN12 in Google Earth Engine](https://gee-community-catalog.org/projects/cloudsen12/)**
<br>
### **General Description**
<br>
| File | Name | Scale | Wavelength | Description | Datatype |
|---------------|-----------------|--------|------------------------------|------------------------------------------------------------------------------------------------------|----------|
| L1C_ & L2A_ | B1 | 0.0001 | 443.9nm (S2A) / 442.3nm (S2B)| Aerosols. | np.int16 |
| | B2 | 0.0001 | 496.6nm (S2A) / 492.1nm (S2B)| Blue. | np.int16 |
| | B3 | 0.0001 | 560nm (S2A) / 559nm (S2B) | Green. | np.int16 |
| | B4 | 0.0001 | 664.5nm (S2A) / 665nm (S2B) | Red. | np.int16 |
| | B5 | 0.0001 | 703.9nm (S2A) / 703.8nm (S2B)| Red Edge 1. | np.int16 |
| | B6 | 0.0001 | 740.2nm (S2A) / 739.1nm (S2B)| Red Edge 2. | np.int16 |
| | B7 | 0.0001 | 782.5nm (S2A) / 779.7nm (S2B)| Red Edge 3. | np.int16 |
| | B8 | 0.0001 | 835.1nm (S2A) / 833nm (S2B) | NIR. | np.int16 |
| | B8A | 0.0001 | 864.8nm (S2A) / 864nm (S2B) | Red Edge 4. | np.int16 |
| | B9 | 0.0001 | 945nm (S2A) / 943.2nm (S2B) | Water vapor. | np.int16 |
| | B11 | 0.0001 | 1613.7nm (S2A) / 1610.4nm (S2B)| SWIR 1. | np.int16 |
| | B12 | 0.0001 | 2202.4nm (S2A) / 2185.7nm (S2B)| SWIR 2. | np.int16 |
| L1C_ | B10 | 0.0001 | 1373.5nm (S2A) / 1376.9nm (S2B)| Cirrus. | np.int16 |
| L2A_ | AOT | 0.001 | - | Aerosol Optical Thickness. | np.int16 |
| | WVP | 0.001 | - | Water Vapor Pressure. | np.int16 |
| | TCI_R | 1 | - | True Color Image, Red. | np.int16 |
| | TCI_G | 1 | - | True Color Image, Green. | np.int16 |
| | TCI_B | 1 | - | True Color Image, Blue. | np.int16 |
| S1_ | VV | 1 | 5.405GHz | Dual-band cross-polarization, vertical transmit/horizontal receive. |np.float32|
| | VH | 1 | 5.405GHz | Single co-polarization, vertical transmit/vertical receive. |np.float32|
| | angle | 1 | - | Incidence angle generated by interpolating the ‘incidenceAngle’ property. |np.float32|
| EXTRA_ | CDI | 0.0001 | - | Cloud Displacement Index. | np.int16 |
| | Shwdirection | 0.01 | - | Azimuth. Values range from 0°- 360°. | np.int16 |
| | elevation | 1 | - | Elevation in meters. Obtained from MERIT Hydro datasets. | np.int16 |
| | ocurrence | 1 | - | JRC Global Surface Water. The frequency with which water was present. | np.int16 |
| | LC100 | 1 | - | Copernicus land cover product. CGLS-LC100 Collection 3. | np.int16 |
| | LC10 | 1 | - | ESA WorldCover 10m v100 product. | np.int16 |
| LABEL_ | fmask | 1 | - | Fmask4.0 cloud masking. | np.int16 |
| | QA60 | 1 | - | SEN2 Level-1C cloud mask. | np.int8 |
| | s2cloudless | 1 | - | s2cloudless results. | np.int8 |
| | sen2cor | 1 | - | Scene Classification band. Obtained from SEN2 level 2A. | np.int8 |
| | cd_fcnn_rgbi | 1 | - | López-Puigdollers et al. results based on RGBI bands. | np.int8 |
| |cd_fcnn_rgbi_swir| 1 | - | López-Puigdollers et al. results based on RGBISWIR bands. | np.int8 |
| | kappamask_L1C | 1 | - | KappaMask results using SEN2 level L1C as input. | np.int8 |
| | kappamask_L2A | 1 | - | KappaMask results using SEN2 level L2A as input. | np.int8 |
| | manual_hq | 1 | | High-quality pixel-wise manual annotation. | np.int8 |
| | manual_sc | 1 | | Scribble manual annotation. | np.int8 |
<br>
### **Label Description**
| **CloudSEN12** | **KappaMask** | **Sen2Cor** | **Fmask** | **s2cloudless** | **CD-FCNN** | **QA60** |
|------------------|------------------|-------------------------|-----------------|-----------------------|---------------------|--------------------|
| 0 Clear | 1 Clear | 4 Vegetation | 0 Clear land | 0 Clear | 0 Clear | 0 Clear |
| | | 2 Dark area pixels | 1 Clear water | | | |
| | | 5 Bare Soils | 3 Snow | | | |
| | | 6 Water | | | | |
| | | 11 Snow | | | | |
| 1 Thick cloud | 4 Cloud | 8 Cloud medium probability | 4 Cloud | 1 Cloud | 1 Cloud | 1024 Opaque cloud |
| | | 9 Cloud high probability | | | | |
| 2 Thin cloud | 3 Semi-transparent cloud | 10 Thin cirrus | | | | 2048 Cirrus cloud |
| 3 Cloud shadow | 2 Cloud shadow | 3 Cloud shadows | 2 Cloud shadow | | | |
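As an illustration only (the helper name and the exact remapping are assumptions read off the table above), a mask from one of the third-party products can be harmonized to the four CloudSEN12 labels with a small lookup table:

```python
import numpy as np

# Hypothetical lookup table mapping Sen2Cor class codes to the four
# CloudSEN12 labels (0 clear, 1 thick cloud, 2 thin cloud, 3 cloud shadow),
# following the comparison table above.
SEN2COR_TO_CLOUDSEN12 = {
    4: 0, 2: 0, 5: 0, 6: 0, 11: 0,  # vegetation, dark area, bare soil, water, snow
    8: 1, 9: 1,                     # cloud medium/high probability
    10: 2,                          # thin cirrus
    3: 3,                           # cloud shadow
}

def harmonize(mask: np.ndarray, lut: dict) -> np.ndarray:
    """Remap a per-pixel class mask to CloudSEN12 labels via a lookup table."""
    out = np.zeros_like(mask, dtype=np.int8)  # codes not in the table fall back to clear
    for src, dst in lut.items():
        out[mask == src] = dst
    return out

sen2cor_mask = np.array([[4, 8], [10, 3]], dtype=np.int8)
print(harmonize(sen2cor_mask, SEN2COR_TO_CLOUDSEN12))  # [[0 1] [2 3]]
```

The same pattern applies to the other products by swapping in their code lists from the table.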
<br>
<br>
# **Dataset information, working with np.memmap:**
Sentinel-1 and Sentinel-2 collect images that span an area of 5090 x 5090 meters at 10 meters per pixel.
This results in 509 x 509 pixel images, a size that most network architectures do not accept directly.
**Since each layer is a two-dimensional matrix, the true image data is held from pixel (1,1) to (509,509).**
The images have been padded with three extra pixels around the border to make them 512 x 512, a size that most models accept.
To give a visual representation of where the padding has been added:
x marks the padded (blank) pixels, stored as black (255):

```
xxxxxxxxxxxxxx
x           xx
x           xx
x           xx
x           xx
x           xx
xxxxxxxxxxxxxx
xxxxxxxxxxxxxx
```
The effects of the padding can be mitigated by taking a random crop within (1,1) to (509,509),
or by center-cropping to the input size required by the network architecture.
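A minimal sketch of both crops, assuming a 256 x 256 target size (the helper names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_crop(img: np.ndarray, size: int = 256) -> np.ndarray:
    """Random crop drawn entirely inside the valid region (1,1)-(509,509)."""
    top = int(rng.integers(1, 511 - size))   # start rows 1 .. 510-size, so the crop ends by row 509
    left = int(rng.integers(1, 511 - size))
    return img[top:top + size, left:left + size]

def center_crop(img: np.ndarray, size: int = 256) -> np.ndarray:
    """Center crop of a padded 512 x 512 patch."""
    off = (img.shape[0] - size) // 2
    return img[off:off + size, off:off + size]

patch = np.zeros((512, 512), dtype=np.int16)
print(random_crop(patch).shape, center_crop(patch).shape)  # (256, 256) (256, 256)
```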
### The image data is currently split into three categories:
- Training: 84.90 % of total
- Validation: 5.35 % of total
- Testing: 9.75 % of total
To recompose the data and take random samples from all 10,000 available images,
the np.memmap objects can be combined and random selections drawn at the beginning of each trial,
sampling the 10,000 images according to the desired percentage of the total data available.
This approach mitigates training bias introduced by the original assignment of images to each category.
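One hedged way to implement this without concatenating the memmaps (which would materialize up to ~1 TB in RAM) is to permute global patch indices and route each index back to its source split on access; the split sizes come from the shapes below, while the 80/10/10 percentages are an assumption for illustration:

```python
import numpy as np

# Sizes of the three memmapped splits (8490 + 535 + 975 = 10,000 patches).
sizes = {"train": 8490, "val": 535, "test": 975}
total = sum(sizes.values())

rng = np.random.default_rng(0)
perm = rng.permutation(total)           # one global shuffle per trial

n_train = int(0.80 * total)             # example: new 80/10/10 split
n_val = int(0.10 * total)
new_train = perm[:n_train]
new_val = perm[n_train:n_train + n_val]
new_test = perm[n_train + n_val:]

def locate(i: int):
    """Map a global index 0..9999 to (split_name, local_index) for memmap access."""
    for name, n in sizes.items():
        if i < n:
            return name, i
        i -= n
    raise IndexError(i)

print(locate(0), locate(8490), locate(9025))  # ('train', 0) ('val', 0) ('test', 0)
```

Each `(split_name, local_index)` pair can then be read from the corresponding memmap opened as in the example below, so only the patches actually sampled are ever loaded.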
<br>
### **Example**
**train shape: (8490, 512, 512)**
<br>
**val shape: (535, 512, 512)**
<br>
**test shape: (975, 512, 512)**
<br>
```py
import numpy as np
# Read high-quality train
train_shape = (8490, 512, 512)
B4X = np.memmap('train/L1C_B04.dat', dtype='int16', mode='r', shape=train_shape)
y = np.memmap('train/manual_hq.dat', dtype='int8', mode='r', shape=train_shape)
# Read high-quality val
val_shape = (535, 512, 512)
B4X = np.memmap('val/L1C_B04.dat', dtype='int16', mode='r', shape=val_shape)
y = np.memmap('val/manual_hq.dat', dtype='int8', mode='r', shape=val_shape)
# Read high-quality test
test_shape = (975, 512, 512)
B4X = np.memmap('test/L1C_B04.dat', dtype='int16', mode='r', shape=test_shape)
y = np.memmap('test/manual_hq.dat', dtype='int8', mode='r', shape=test_shape)
```
<br>
This work has been partially supported by the Spanish Ministry of Science and Innovation project
PID2019-109026RB-I00 (MINECO-ERDF) and the Austrian Space Applications Programme within the
**[SemantiX project](https://austria-in-space.at/en/projects/2019/semantix.php)**.
|
tyzhu/squad_qa_num_v5_full_recite_full_passage | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 9118126
num_examples: 5070
- name: validation
num_bytes: 580808
num_examples: 300
download_size: 1769784
dataset_size: 9698934
---
# Dataset Card for "squad_qa_num_v5_full_recite_full_passage"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jay401521/weibo_senti_test | ---
dataset_info:
features:
- name: label
dtype: int64
- name: review
dtype: string
splits:
- name: train
num_bytes: 3433361
num_examples: 20000
download_size: 2608855
dataset_size: 3433361
---
# Dataset Card for "weibo_senti_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
humane-lab/K-HATERS-Ratings | ---
license: cc-by-4.0
---
|
FVilmar/faabricio_silv | ---
license: openrail
---
|
Seanxh/twitter_dataset_1713144474 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20650
num_examples: 46
download_size: 12157
dataset_size: 20650
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
3una/Fer2013 | ---
task_categories:
- image-classification
pretty_name: FER2013
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gryffindor-ISWS/dbpedia_abstracts_fictional_characters_with_img | ---
license: gpl-3.0
language:
- en
---
DBpedia Abstracts |
liuyanchen1015/MULTI_VALUE_sst2_too_sub | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 2406
num_examples: 20
- name: test
num_bytes: 5536
num_examples: 42
- name: train
num_bytes: 83543
num_examples: 857
download_size: 39073
dataset_size: 91485
---
# Dataset Card for "MULTI_VALUE_sst2_too_sub"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsstein/50-baseline-dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
splits:
- name: train
num_bytes: 86432169
num_examples: 15326
- name: test
num_bytes: 3068413
num_examples: 576
- name: validation
num_bytes: 3265707
num_examples: 576
download_size: 57468120
dataset_size: 92766289
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
NicoBelicoBRUS/Mario | ---
license: apache-2.0
---
|
InceptiveDev/user-dataset-replytexts | ---
license: mit
---
|
Kelvin878/gc10_det_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
- name: guide_with_background
dtype: image
splits:
- name: train
num_bytes: 546273024.124
num_examples: 1594
download_size: 545099494
dataset_size: 546273024.124
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
samfmn/guard | ---
license: mit
---
|
Raghunath007/ipl | ---
license: other
language:
- en
tags:
- ipl 2023
- ipl
- Indian premier League
- cricket
- Indian Cricket
- BCCI
size_categories:
- 100K<n<1M
--- |
dim/grammarly_coedit | ---
dataset_info:
features:
- name: _id
dtype: string
- name: task
dtype: string
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 19943349
num_examples: 82466
download_size: 11658767
dataset_size: 19943349
---
# Dataset Card for "grammarly_coedit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-Blaise-g__scitldr-89735e41-12705694 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Blaise-g/scitldr
eval_info:
task: summarization
model: Blaise-g/longt5_tglobal_large_explanatory_baseline_scitldr
metrics: ['bertscore']
dataset_name: Blaise-g/scitldr
dataset_config: Blaise-g--scitldr
dataset_split: test
col_mapping:
text: source
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/longt5_tglobal_large_explanatory_baseline_scitldr
* Dataset: Blaise-g/scitldr
* Config: Blaise-g--scitldr
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise-g](https://huggingface.co/Blaise-g) for evaluating this model. |
sandy50422gmail/SelfTestDataset | ---
license: unknown
---
|
communityai/Telugu-LLM-Labs___urdu_alpaca_yahma_cleaned_filtered | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 55541899.0
num_examples: 28910
download_size: 23453422
dataset_size: 55541899.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vinnybustacap/Patents | ---
license: apache-2.0
---
|
SkyWR/DigoCaires | ---
license: openrail
---
|
iamshnoo/alpaca-cleaned-chinese | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 30759982
num_examples: 51760
download_size: 17896759
dataset_size: 30759982
---
Translated from yahma/alpaca-cleaned using NLLB-1.3B
# Dataset Card for "alpaca-cleaned-chinese"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Binaryy/travel_sample_extended | ---
dataset_info:
features:
- name: query
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 203357
num_examples: 110
download_size: 109729
dataset_size: 203357
---
# Dataset Card for "travel_sample_extended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Noodlz__DolphinStar-12.5B | ---
pretty_name: Evaluation run of noodlz/DolphinStar-12.5B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [noodlz/DolphinStar-12.5B](https://huggingface.co/noodlz/DolphinStar-12.5B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_noodlz__DolphinStar-12.5B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T17:38:16.335466](https://huggingface.co/datasets/open-llm-leaderboard/details_noodlz__DolphinStar-12.5B/blob/main/results_2024-04-15T17-38-16.335466.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6020379089051342,\n\
\ \"acc_stderr\": 0.033053858660412175,\n \"acc_norm\": 0.6073661491404435,\n\
\ \"acc_norm_stderr\": 0.0337283660403956,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.515149606406234,\n\
\ \"mc2_stderr\": 0.015577931816775841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6281617207727545,\n\
\ \"acc_stderr\": 0.004823078145064965,\n \"acc_norm\": 0.8199561840270863,\n\
\ \"acc_norm_stderr\": 0.0038343870022708873\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932267,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932267\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180276,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180276\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531006,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531006\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042778,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042778\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.03210479051015776,\n \
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077795,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077795\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.01471168438613995,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.01471168438613995\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2770949720670391,\n\
\ \"acc_stderr\": 0.014968772435812145,\n \"acc_norm\": 0.2770949720670391,\n\
\ \"acc_norm_stderr\": 0.014968772435812145\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.026289734945952926,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.026289734945952926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657115,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657115\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.515149606406234,\n\
\ \"mc2_stderr\": 0.015577931816775841\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.01196129890580315\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35405610310841545,\n \
\ \"acc_stderr\": 0.01317272838522258\n }\n}\n```"
repo_url: https://huggingface.co/noodlz/DolphinStar-12.5B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-24-00.527475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-38-16.335466.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-38-16.335466.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- '**/details_harness|winogrande|5_2024-04-15T17-24-00.527475.parquet'
- split: 2024_04_15T17_38_16.335466
path:
- '**/details_harness|winogrande|5_2024-04-15T17-38-16.335466.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T17-38-16.335466.parquet'
- config_name: results
data_files:
- split: 2024_04_15T17_24_00.527475
path:
- results_2024-04-15T17-24-00.527475.parquet
- split: 2024_04_15T17_38_16.335466
path:
- results_2024-04-15T17-38-16.335466.parquet
- split: latest
path:
- results_2024-04-15T17-38-16.335466.parquet
---
# Dataset Card for Evaluation run of noodlz/DolphinStar-12.5B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [noodlz/DolphinStar-12.5B](https://huggingface.co/noodlz/DolphinStar-12.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_noodlz__DolphinStar-12.5B",
"harness_winogrande_5",
split="train")
```
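Since each run's split is named with its timestamp, the most recent run can also be selected programmatically instead of relying on the `latest` alias. A minimal sketch (the `latest_split` helper is illustrative, not part of the `datasets` API):

```python
# Split names are run timestamps such as "2024_04_15T17_38_16.335466".
# Because the timestamp format is fixed-width, lexicographic order matches
# chronological order, so max() picks the most recent run.
def latest_split(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2024_04_15T17_24_00.527475", "2024_04_15T17_38_16.335466", "latest"]
print(latest_split(splits))  # -> 2024_04_15T17_38_16.335466
```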
## Latest results
These are the [latest results from run 2024-04-15T17:38:16.335466](https://huggingface.co/datasets/open-llm-leaderboard/details_noodlz__DolphinStar-12.5B/blob/main/results_2024-04-15T17-38-16.335466.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6020379089051342,
"acc_stderr": 0.033053858660412175,
"acc_norm": 0.6073661491404435,
"acc_norm_stderr": 0.0337283660403956,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.515149606406234,
"mc2_stderr": 0.015577931816775841
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938213
},
"harness|hellaswag|10": {
"acc": 0.6281617207727545,
"acc_stderr": 0.004823078145064965,
"acc_norm": 0.8199561840270863,
"acc_norm_stderr": 0.0038343870022708873
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932267,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932267
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180276,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180276
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531006,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042778,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042778
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077795,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077795
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613995,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613995
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2770949720670391,
"acc_stderr": 0.014968772435812145,
"acc_norm": 0.2770949720670391,
"acc_norm_stderr": 0.014968772435812145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657115,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657115
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.515149606406234,
"mc2_stderr": 0.015577931816775841
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.01196129890580315
},
"harness|gsm8k|5": {
"acc": 0.35405610310841545,
"acc_stderr": 0.01317272838522258
}
}
```
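The per-task entries above can also be aggregated by hand, for instance as an unweighted mean of `acc` over a subset of tasks. This is only a sketch: the leaderboard computes its aggregate metrics with its own weighting, so the result below is illustrative rather than the official average.

```python
# Unweighted mean accuracy over a subset of the per-task results above.
results = {
    "harness|arc:challenge|25": {"acc": 0.5733788395904437},
    "harness|hellaswag|10": {"acc": 0.6281617207727545},
    "harness|winogrande|5": {"acc": 0.7624309392265194},
}

def mean_acc(res):
    vals = [metrics["acc"] for metrics in res.values()]
    return sum(vals) / len(vals)

print(round(mean_acc(results), 4))  # -> 0.6547
```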
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of Yuma42/KangalKhan-DesolatingRuby-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yuma42/KangalKhan-DesolatingRuby-7B](https://huggingface.co/Yuma42/KangalKhan-DesolatingRuby-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-DesolatingRuby-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T15:13:07.843791](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-DesolatingRuby-7B/blob/main/results_2024-02-22T15-13-07.843791.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6365320059867312,\n\
\ \"acc_stderr\": 0.032268560180940126,\n \"acc_norm\": 0.6381465769621488,\n\
\ \"acc_norm_stderr\": 0.03291355978147849,\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.017087795881769625,\n \"mc2\": 0.5705454564737066,\n\
\ \"mc2_stderr\": 0.015416440354205063\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.01413117676013117,\n\
\ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.01375206241981783\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6702848038239394,\n\
\ \"acc_stderr\": 0.004691488813032163,\n \"acc_norm\": 0.8546106353316073,\n\
\ \"acc_norm_stderr\": 0.0035177257870177515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097424,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097424\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509986,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n\
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.015721531075183873,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.015721531075183873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015053,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015053\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013007,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013007\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.017087795881769625,\n \"mc2\": 0.5705454564737066,\n\
\ \"mc2_stderr\": 0.015416440354205063\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.01155529528605928\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.013373971277729817\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-DesolatingRuby-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|arc:challenge|25_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|gsm8k|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hellaswag|10_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-13-07.843791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T15-13-07.843791.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- '**/details_harness|winogrande|5_2024-02-22T15-13-07.843791.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T15-13-07.843791.parquet'
- config_name: results
data_files:
- split: 2024_02_22T15_13_07.843791
path:
- results_2024-02-22T15-13-07.843791.parquet
- split: latest
path:
- results_2024-02-22T15-13-07.843791.parquet
---
# Dataset Card for Evaluation run of Yuma42/KangalKhan-DesolatingRuby-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-DesolatingRuby-7B](https://huggingface.co/Yuma42/KangalKhan-DesolatingRuby-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-DesolatingRuby-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-22T15:13:07.843791](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-DesolatingRuby-7B/blob/main/results_2024-02-22T15-13-07.843791.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6365320059867312,
"acc_stderr": 0.032268560180940126,
"acc_norm": 0.6381465769621488,
"acc_norm_stderr": 0.03291355978147849,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.017087795881769625,
"mc2": 0.5705454564737066,
"mc2_stderr": 0.015416440354205063
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.01413117676013117,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.01375206241981783
},
"harness|hellaswag|10": {
"acc": 0.6702848038239394,
"acc_stderr": 0.004691488813032163,
"acc_norm": 0.8546106353316073,
"acc_norm_stderr": 0.0035177257870177515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097424,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097424
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509986,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183873,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015053,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015053
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013007,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013007
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.017087795881769625,
"mc2": 0.5705454564737066,
"mc2_stderr": 0.015416440354205063
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.01155529528605928
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729817
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CJWeiss/LexGenZero_billsum | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: input
dtype: string
- name: output
dtype: string
- name: fk_grade
dtype: float64
- name: cluster
dtype: string
- name: old_id
dtype: int64
splits:
- name: train
num_bytes: 81528
num_examples: 50
download_size: 48667
dataset_size: 81528
---
# Dataset Card for "LexGenZero_billsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
swadesh7/processed_bert_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 3600
num_examples: 1
download_size: 4997
dataset_size: 3600
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
skrishna/SeqSense_mcq_2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 16937
num_examples: 300
download_size: 4712
dataset_size: 16937
---
# Dataset Card for "SeqSense_mcq_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dnjdsxor21/nego-dialogue-53 | ---
dataset_info:
features:
- name: result
dtype: string
- name: title
dtype: string
- name: description
dtype: string
- name: events
list:
- name: message
dtype: string
- name: role
dtype: string
- name: price
dtype: int64
splits:
- name: train
num_bytes: 73276
num_examples: 55
download_size: 23285
dataset_size: 73276
---
# Dataset Card for "nego-dialogue-53"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_98 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1259228576
num_examples: 245368
download_size: 1283245963
dataset_size: 1259228576
---
# Dataset Card for "chunk_98"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/m4singer_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
splits:
- name: original
num_bytes: 40151638.0
num_examples: 217
- name: academicodec_hifi_16k_320d
num_bytes: 40096637.0
num_examples: 217
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 40096637.0
num_examples: 217
- name: academicodec_hifi_24k_320d
num_bytes: 60154877.0
num_examples: 217
- name: audiodec_24k_320d
num_bytes: 60275237.0
num_examples: 217
- name: dac_16k
num_bytes: 40151855.0
num_examples: 217
- name: dac_24k
num_bytes: 60219467.0
num_examples: 217
- name: dac_44k
num_bytes: 110639439.0
num_examples: 217
- name: encodec_24k_12bps
num_bytes: 60219467.0
num_examples: 217
- name: encodec_24k_1_5bps
num_bytes: 60219467.0
num_examples: 217
- name: encodec_24k_24bps
num_bytes: 60219467.0
num_examples: 217
- name: encodec_24k_3bps
num_bytes: 60219467.0
num_examples: 217
- name: encodec_24k_6bps
num_bytes: 60219467.0
num_examples: 217
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 40143099.0
num_examples: 217
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 40143099.0
num_examples: 217
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 40151855.0
num_examples: 217
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 40151855.0
num_examples: 217
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 40151855.0
num_examples: 217
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 40151855.0
num_examples: 217
- name: speech_tokenizer_16k
num_bytes: 40206077.0
num_examples: 217
download_size: 1017913637
dataset_size: 1033982817.0
---
# Dataset Card for "m4singer_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlpie/Llama2-MedTuned-Instructions | ---
license: cc-by-nc-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 206029981
num_examples: 200252
- name: validation
num_bytes: 59653564
num_examples: 70066
download_size: 0
dataset_size: 265683545
---
# Dataset Card for "Llama2-MedTuned-Instructions"
## Dataset Description
Llama2-MedTuned-Instructions is an instruction-based dataset developed for training language models in biomedical NLP tasks. It consists of approximately 200,000 samples, each tailored to guide models in performing specific tasks such as Named Entity Recognition (NER), Relation Extraction (RE), and Medical Natural Language Inference (NLI). This dataset represents a fusion of various existing data sources, reformatted to facilitate instruction-based learning.
## Source Datasets and Composition
The dataset amalgamates training subsets from several prominent biomedical datasets:
- **Named Entity Recognition (NER)**: Utilises NCBI-disease, BC5CDR-disease, BC5CDR-chem, BC2GM, JNLPBA, and i2b2-2012 datasets.
- **Relation Extraction (RE)**: Incorporates i2b2-2010 dataset.
- **Natural Language Inference (NLI)**: Employs the MedNLI dataset.
- **Document Classification**: Uses the hallmarks of cancer (HoC) dataset.
- **Question Answering (QA)**: Includes samples from ChatDoctor and PMC-Llama-Instructions datasets.
## Prompting Strategy
Each sample in the dataset follows a three-part structure: Instruction, Input, and Output. This format ensures clarity in task directives and expected outcomes, aligning with the instruction-based training approach.
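The three-part structure described above can be assembled into a single training prompt. The template below is an assumption for illustration only (the exact delimiters used during training are not specified in this card), and the NER-style sample is hypothetical:

```python
def build_prompt(sample: dict) -> str:
    """Assemble one Instruction/Input/Output sample into a single prompt string.

    The "### ..." delimiters are an assumption; this sketch only illustrates
    the three-part structure described above.
    """
    return (
        f"### Instruction:\n{sample['instruction']}\n\n"
        f"### Input:\n{sample['input']}\n\n"
        f"### Output:\n{sample['output']}"
    )


# Hypothetical NER-style sample, for illustration only.
sample = {
    "instruction": "Identify all disease mentions in the input text.",
    "input": "The patient was diagnosed with type 2 diabetes.",
    "output": "type 2 diabetes",
}
print(build_prompt(sample))
```

At inference time the same template would typically be rendered without the output section, leaving the model to generate it.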
## Usage and Application
This dataset is ideal for training and evaluating models on biomedical NLP tasks, particularly those focused on understanding and processing medical and clinical text. It serves as a benchmark for assessing model performance in domain-specific tasks, comparing against established models like BioBERT and BioClinicalBERT.
## Acknowledgements
We extend our gratitude to all contributors and supporting institutions.
## Citation
For utilising this dataset in academic work or applications, please cite:
```bibtex
@misc{rohanian2023exploring,
title={Exploring the Effectiveness of Instruction Tuning in Biomedical Language Processing},
author={Omid Rohanian and Mohammadmahdi Nouriborji and David A. Clifton},
year={2023},
eprint={2401.00579},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
turkish-nlp-suite/Corona-mini | ---
language:
- tr
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- n<1K
task_categories:
- summarization
pretty_name: Corona-mini
---
# Dataset Card for turkish-nlp-suite/Corona-mini
## Dataset Description
- **Repository:** [Turkish Corona-mini corpus](https://github.com/turkish-nlp-suite/Corona-mini-dataset)
- **Paper:** [ACL link]()
- **Dataset:** Corona-mini
- **Domain:** Social Media
<img src="https://raw.githubusercontent.com/turkish-nlp-suite/.github/main/profile/corona-mini.png" width="20%" height="20%">
### Dataset Summary
This is a tiny Turkish corpus consisting of comments about Corona symptoms. The corpus is compiled from two Ekşisözlük headlines "covid-19 belirtileri" and "gün gün koronavirüs belirtileri":
https://eksisozluk.com/covid-19-belirtileri--6416646
https://eksisozluk.com/gun-gun-koronavirus-belirtileri--6757665
This corpus
- contains 178 raw and 175 processed comments
- is entirely in Turkish
- comes in two versions, raw and mildly processed.
For the processed version, HTML tags, expressions in brackets and some other tags are removed.
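The mild processing described above can be sketched with a few regular expressions. The exact rules applied to the released corpus are not documented here, so treat this as an approximation rather than the actual preprocessing script:

```python
import re


def mild_clean(text: str) -> str:
    """Roughly reproduce the mild processing described above:
    strip HTML tags and expressions in brackets, then tidy whitespace.
    """
    text = re.sub(r"<[^>]+>", "", text)       # drop HTML tags
    text = re.sub(r"\([^)]*\)", "", text)     # drop (...) expressions
    text = re.sub(r"\[[^\]]*\]", "", text)    # drop [...] expressions
    return re.sub(r"\s+", " ", text).strip()  # normalise whitespace


print(mild_clean("bkz: <i>koronavirüs</i> (edit: düzeltme) belirtileri"))
# → "bkz: koronavirüs belirtileri"
```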
If you want more information about how this dataset was crafted, you can watch the playlist of my campaign "Turkish NLP with Duygu": [How to compile datasets](https://www.youtube.com/playlist?list=PLJTHlIwB8Vco4ONU_mCNOYIcVyFA9QrBr).
If you want to process this dataset with spaCy Turkish, you can watch: [Recipes with spaCy Turkish](https://www.youtube.com/watch?v=w0WCkgCOzzw&list=PLJTHlIwB8VcoWxYHnsZOQCxWOraW42NBj)
### Dataset Instances
An instance of this dataset looks as follows:
```
{
"text": "beni sarsmayan belirtilerdir, 2 doz biontech aşılıyım, 2. doz üzerinden 5 aydan çok geçmişti cuma : ayın 12 si akşamı açık havada az üşümeye maruz kaldım."
}
```
### Data Split
| name |train|
|---------|----:|
|Corona-mini|175|
### Citation
This work is supported by the Google Developer Experts Program and is part of the Duygu 2022 Fall-Winter collection, "Turkish NLP with Duygu" / "Duygu'yla Türkçe NLP". All rights reserved. If you'd like to use this dataset in your own work, please kindly cite [A Diverse Set of Freely Available Linguistic Resources for Turkish](https://aclanthology.org/2023.acl-long.768/):
```
@inproceedings{altinok-2023-diverse,
title = "A Diverse Set of Freely Available Linguistic Resources for {T}urkish",
author = "Altinok, Duygu",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.768",
pages = "13739--13750",
abstract = "This study presents a diverse set of freely available linguistic resources for Turkish natural language processing, including corpora, pretrained models and education material. Although Turkish is spoken by a sizeable population of over 80 million people, Turkish linguistic resources for natural language processing remain scarce. In this study, we provide corpora to allow practitioners to build their own applications and pretrained models that would assist industry researchers in creating quick prototypes. The provided corpora include named entity recognition datasets of diverse genres, including Wikipedia articles and supplement products customer reviews. In addition, crawling e-commerce and movie reviews websites, we compiled several sentiment analysis datasets of different genres. Our linguistic resources for Turkish also include pretrained spaCy language models. To the best of our knowledge, our models are the first spaCy models trained for the Turkish language. Finally, we provide various types of education material, such as video tutorials and code examples, that can support the interested audience on practicing Turkish NLP. The advantages of our linguistic resources are three-fold: they are freely available, they are first of their kind, and they are easy to use in a broad range of implementations. Along with a thorough description of the resource creation process, we also explain the position of our resources in the Turkish NLP world.",
}
```
|
Meduzka/telegram_data_war_in_ukraine | ---
license: apache-2.0
dataset_info:
features:
- name: date
dtype: int64
- name: text_low
dtype: string
splits:
- name: train
num_bytes: 1115692580
num_examples: 433434
download_size: 524550062
dataset_size: 1115692580
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ThrustEra/videos | ---
license: mit
---
|
open-llm-leaderboard/details_flemmingmiguel__MDBX-7B | ---
pretty_name: Evaluation run of flemmingmiguel/MDBX-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [flemmingmiguel/MDBX-7B](https://huggingface.co/flemmingmiguel/MDBX-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__MDBX-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-21T08:08:27.552111](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MDBX-7B/blob/main/results_2024-01-21T08-08-27.552111.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655806438283324,\n\
\ \"acc_stderr\": 0.03200415575634736,\n \"acc_norm\": 0.6548887828373608,\n\
\ \"acc_norm_stderr\": 0.032676368096110006,\n \"mc1\": 0.5446756425948592,\n\
\ \"mc1_stderr\": 0.017433490102538758,\n \"mc2\": 0.6818712158396469,\n\
\ \"mc2_stderr\": 0.015135432675602247\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n\
\ \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.013119040897725922\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7108145787691695,\n\
\ \"acc_stderr\": 0.004524575892952949,\n \"acc_norm\": 0.8830910177255527,\n\
\ \"acc_norm_stderr\": 0.0032065512832573956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"\
acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5446756425948592,\n\
\ \"mc1_stderr\": 0.017433490102538758,\n \"mc2\": 0.6818712158396469,\n\
\ \"mc2_stderr\": 0.015135432675602247\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237422\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7217589082638363,\n \
\ \"acc_stderr\": 0.012343803671422678\n }\n}\n```"
repo_url: https://huggingface.co/flemmingmiguel/MDBX-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|arc:challenge|25_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|gsm8k|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hellaswag|10_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T08-08-27.552111.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- '**/details_harness|winogrande|5_2024-01-21T08-08-27.552111.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T08-08-27.552111.parquet'
- config_name: results
data_files:
- split: 2024_01_21T08_08_27.552111
path:
- results_2024-01-21T08-08-27.552111.parquet
- split: latest
path:
- results_2024-01-21T08-08-27.552111.parquet
---
# Dataset Card for Evaluation run of flemmingmiguel/MDBX-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/MDBX-7B](https://huggingface.co/flemmingmiguel/MDBX-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named with the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MDBX-7B",
"harness_winogrande_5",
split="train")
```
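As the configuration list above shows, each run's split name is derived from the run timestamp by replacing `-` and `:` with `_`. A minimal sketch of that convention (the helper name is hypothetical, not part of the `datasets` API):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset's configs.

    Hypothetical helper illustrating the naming convention visible in the
    configuration list above.
    """
    return ts.replace("-", "_").replace(":", "_")

# The run documented in this card:
print(timestamp_to_split_name("2024-01-21T08:08:27.552111"))
# 2024_01_21T08_08_27.552111
```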
## Latest results
These are the [latest results from run 2024-01-21T08:08:27.552111](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MDBX-7B/blob/main/results_2024-01-21T08-08-27.552111.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.655806438283324,
"acc_stderr": 0.03200415575634736,
"acc_norm": 0.6548887828373608,
"acc_norm_stderr": 0.032676368096110006,
"mc1": 0.5446756425948592,
"mc1_stderr": 0.017433490102538758,
"mc2": 0.6818712158396469,
"mc2_stderr": 0.015135432675602247
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068744,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.7108145787691695,
"acc_stderr": 0.004524575892952949,
"acc_norm": 0.8830910177255527,
"acc_norm_stderr": 0.0032065512832573956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5446756425948592,
"mc1_stderr": 0.017433490102538758,
"mc2": 0.6818712158396469,
"mc2_stderr": 0.015135432675602247
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237422
},
"harness|gsm8k|5": {
"acc": 0.7217589082638363,
"acc_stderr": 0.012343803671422678
}
}
```
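The `all` block at the top of the results appears to be the mean of the per-task metrics. A sketch of how such an aggregate can be computed from a results dict like the one above (truncated to two tasks for illustration; this is an assumption about the aggregation, not the leaderboard's documented recipe):

```python
# Two per-task entries copied from the results above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}

# Unweighted mean accuracy across tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(mean_acc)
```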
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mehnaazasad/arxiv_astro_co_ga | ---
license: mit
task_categories:
- summarization
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for `arxiv_astro_co_ga`
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a dataset consisting of titles and abstracts for all Cosmology and Galaxy Astrophysics arXiv articles to date (99,659 papers).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
```
{'title': 'Probing cluster formation under extreme conditions: massive star clusters in blue compact galaxies',
'abstract': ' The numerous and massive young star clusters in blue compact galaxies (BCGs) are used to investigate the properties of their hosts. We test whether BCGs follow claimed relations between cluster populations and their hosts, such as the the fraction of the total luminosity contributed by the clusters as function of the mean star formation rate density; the $V$ band luminosity of the brightest youngest cluster as related to the mean host star formation rate; and the cluster formation efficiency (i.e., the fraction of star formation happening in star clusters) versus the density of the SFR. We find that BCGs follow the trends, supporting a scenario where cluster formation and environmental properties of the host are correlated. They occupy, in all the diagrams, the regions of higher SFRs, as expected by the extreme nature of the starbursts operating in these systems. We find that the star clusters contribute almost to the 20 % of the UV luminosity of the hosts. We suggest that the BCG starburst environment has most likely favoured the compression and collapse of the giant molecular clouds, enhancing the local star formation efficiency, so that massive clusters have been formed. The estimated cluster formation efficiency supports this scenario. BCGs have a cluster formation efficiency comparable to luminous IR galaxies and spiral starburst nuclei (the averaged value is about 35 %) which is much higher than the 8 - 10 % reported for quiescent spirals and dwarf star-forming galaxies. '
}
```
### Data Fields
- `title`: Title of the paper
- `abstract`: The abstract of the paper
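Each record is a plain dict with these two string fields. A minimal sketch of working with one (the values are abbreviated from the example instance shown above):

```python
# Each record carries two string fields: "title" and "abstract".
# Values below are abbreviated from the example instance in this card.
record = {
    "title": "Probing cluster formation under extreme conditions: "
             "massive star clusters in blue compact galaxies",
    "abstract": "The numerous and massive young star clusters in blue "
                "compact galaxies (BCGs) are used to investigate the "
                "properties of their hosts.",
}

# A rough whitespace word count of the abstract.
n_words = len(record["abstract"].split())

print(record["title"].split(":")[0])  # Probing cluster formation under extreme conditions
```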
### Data Splits
This dataset has 3 splits: _train_, _validation_, and _test_. Below are the statistics for these splits.
| Dataset Split | Number of Instances in Split |
| ------------- | ---------------------------- |
| Train         | 79,727                       |
| Validation    | 9,966                        |
| Test          | 9,966                        |
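As a quick sanity check, the split sizes in the table above partition the full 99,659-paper corpus in roughly an 80/10/10 ratio:

```python
# Split sizes taken from the table in this card.
splits = {"train": 79_727, "validation": 9_966, "test": 9_966}

total = sum(splits.values())  # 99,659 papers in all
fractions = {name: n / total for name, n in splits.items()}

print(total)                              # 99659
print(round(fractions["train"], 2))       # 0.8
print(round(fractions["validation"], 2))  # 0.1
```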
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
The original dataset from which this subset was constructed can be found here: [Kaggle arXiv Dataset Homepage](https://www.kaggle.com/Cornell-University/arxiv).
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Various authors.
### Annotations
This dataset contains no annotations.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
No author information included in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The original data is maintained by arXiv. Huge thanks to the team for building and maintaining that dataset.
### Licensing Information
The arxiv_astro_co_ga dataset version 1.0.0 is released under the [MIT License](https://opensource.org/licenses/MIT).
### Citation Information
```
@misc{clement2019arxiv,
title={On the Use of ArXiv as a Dataset},
author={Colin B. Clement and Matthew Bierbaum and Kevin P. O'Keeffe and Alexander A. Alemi},
year={2019},
eprint={1905.00075},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
### Contributions
[More Information Needed] |
ShenaoZ/0.0001_idpo_same_6iters_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: train_prefs_1
num_bytes: 89449992
num_examples: 10189
- name: test_prefs_1
num_bytes: 17788280
num_examples: 2000
- name: train_prefs_2
num_bytes: 85726365
num_examples: 10189
- name: test_prefs_2
num_bytes: 16675423
num_examples: 2000
- name: train_prefs_3
num_bytes: 85140737
num_examples: 10189
- name: test_prefs_3
num_bytes: 16674518
num_examples: 2000
- name: train_prefs_4
num_bytes: 85915223
num_examples: 10189
- name: test_prefs_4
num_bytes: 16763941
num_examples: 2000
download_size: 227854622
dataset_size: 414134479
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_2
path: data/test_prefs_2-*
- split: train_prefs_3
path: data/train_prefs_3-*
- split: test_prefs_3
path: data/test_prefs_3-*
- split: train_prefs_4
path: data/train_prefs_4-*
- split: test_prefs_4
path: data/test_prefs_4-*
---
# Dataset Card for "0.0001_idpo_same_6iters_dataset"
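The YAML header above describes preference-pair records (a prompt, `chosen`/`rejected` message lists, and scores). A minimal sketch of that record shape, showing a subset of the fields — every value here is an invented placeholder, not real dataset content:

```python
# Illustrative record matching part of the feature schema in the YAML header.
# All values are invented placeholders for illustration only.
example = {
    "prompt": "Explain gradient descent in one sentence.",
    "prompt_id": "abc123",
    "chosen": [
        {"role": "user", "content": "Explain gradient descent in one sentence."},
        {"role": "assistant", "content": "Iteratively step opposite the gradient."},
    ],
    "rejected": [
        {"role": "user", "content": "Explain gradient descent in one sentence."},
        {"role": "assistant", "content": "It is a kind of hill."},
    ],
    "score_chosen": 8.0,
    "score_rejected": 3.0,
    "is_better": True,
}

# chosen/rejected share the user turn and differ only in the assistant reply.
assert example["chosen"][0] == example["rejected"][0]
print(example["score_chosen"] > example["score_rejected"])  # True
```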
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16 | ---
pretty_name: Evaluation run of Mikael110/llama-2-13b-guanaco-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikael110/llama-2-13b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T06:46:55.405946](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16/blob/main/results_2023-10-15T06-46-55.405946.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893348,\n \"f1\": 0.0650419463087247,\n\
\ \"f1_stderr\": 0.0014141562591008796,\n \"acc\": 0.43250519246062497,\n\
\ \"acc_stderr\": 0.010503130855979311\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893348,\n\
\ \"f1\": 0.0650419463087247,\n \"f1_stderr\": 0.0014141562591008796\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \
\ \"acc_stderr\": 0.00882048549144247\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516153\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T06_46_55.405946
path:
- '**/details_harness|drop|3_2023-10-15T06-46-55.405946.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T06-46-55.405946.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T06_46_55.405946
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-46-55.405946.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-46-55.405946.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:22:01.485033.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:22:01.485033.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T06_46_55.405946
path:
- '**/details_harness|winogrande|5_2023-10-15T06-46-55.405946.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T06-46-55.405946.parquet'
- config_name: results
data_files:
- split: 2023_07_24T14_22_01.485033
path:
- results_2023-07-24T14:22:01.485033.parquet
- split: 2023_10_15T06_46_55.405946
path:
- results_2023-10-15T06-46-55.405946.parquet
- split: latest
path:
- results_2023-10-15T06-46-55.405946.parquet
---
# Dataset Card for Evaluation run of Mikael110/llama-2-13b-guanaco-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikael110/llama-2-13b-guanaco-fp16](https://huggingface.co/Mikael110/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T06:46:55.405946](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikael110__llama-2-13b-guanaco-fp16/blob/main/results_2023-10-15T06-46-55.405946.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893348,
"f1": 0.0650419463087247,
"f1_stderr": 0.0014141562591008796,
"acc": 0.43250519246062497,
"acc_stderr": 0.010503130855979311
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893348,
"f1": 0.0650419463087247,
"f1_stderr": 0.0014141562591008796
},
"harness|gsm8k|5": {
"acc": 0.11599696739954511,
"acc_stderr": 0.00882048549144247
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516153
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jellyShuang/MMCT | ---
license: mit
---
# Multi-Moving Camera Pedestrian Tracking with a New Dataset and Global Link Model
[](https://huggingface.co/datasets/jellyShuang/MMCT)
This repository contains the details of the dataset and the Pytorch implementation of the Paper:
[Multi-Moving Camera Pedestrian Tracking with a New Dataset and Global Link Model](##)
## Abstract
Ensuring driving safety for autonomous vehicles has become increasingly crucial, highlighting the need for systematic tracking of pedestrians on the road. Most vehicles are equipped with visual sensors, however, the large-scale visual dataset from different agents has not been well studied. Most of the multi-target multi-camera (MTMC) tracking systems are composed of two modules: single-camera tracking (SCT) and inter-camera tracking (ICT). To reliably coordinate between them, MTMC tracking has been a very complicated task, while tracking across multi-moving cameras makes it even more challenging. In this paper, we focus on multi-target multi-moving camera (MTMMC) tracking, which is attracting increasing attention from the research community. Observing there are few datasets for MTMMC tracking, we collect a new dataset, called Multi-Moving Camera Track (MMCT), which contains sequences under various driving scenarios. To address the common problems of identity switch easily faced by most existing SCT trackers, especially for moving cameras due to ego-motion between the camera and targets, a lightweight appearance-free global link model, called Linker, is proposed to mitigate the identity switch by associating two disjoint tracklets of the same target into a complete trajectory within the same camera. Incorporated with Linker, existing SCT trackers generally obtain a significant improvement. Moreover, a strong baseline approach of re-identification (Re-ID) is effectively incorporated to extract robust appearance features under varying surroundings for pedestrian association across moving cameras for ICT, resulting in a much improved MTMMC tracking system, which can constitute a step further towards coordinated mining of multiple moving cameras.
- **<a href="#des"> <u>Dataset Description</u>**</a>
- **<a href="#str"> <u>Dataset Structure</u>**</a>
- **<a href="#dow"> <u>Dataset Downloads</u>**</a>
## <a id="des">Dataset Description</a>
We collect data in 12 distinct scenarios: 'A', 'B', 'C', ..., 'L'. Each scenario may include the interaction of two or three cameras mounted on different cars. For example, scene A includes the two sequences `A-I` and `A-II`. There are 32 sequences in total.
### <a id="str">Dataset Structure</a>
```
MMCT
├── data
│ ├── gps
│ └── labelS
└── images
├── 1
│ ├── A
│ │ ├── IMG_0098-frag-s1-a-fps5.mp4
│ │ └── jpg
│ └── C
│ ├── IMG_0559-frag-s1-c-fps5.mp4
│ ├── jpg
├── 2
│ ├── A
│ │ ├── IMG_0094-frag-s2-a-fps5.mp4
│ │ ├── jpg
│ ├── B
│ │ ├── IMG_2248-frag-s2-b-fps5.mp4
│ │ ├── jpg
...
├── 12
│ ├── A
│ │ ├── IMG_0104-frag-s12-a-fps5.mp4
│ │ ├── jpg
│ ├── B
│ │ ├── IMG_2254-frag-s12-b-fps5.mp4
│ │ ├── jpg
│ └── C
│ ├── IMG_0569-frag-s12-c-fps5.mp4
│ ├── jpg
```
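Given the layout above, the per-scene camera folders can be enumerated with a small helper. This is a minimal sketch, assuming the archive has been extracted locally so that `MMCT/images` matches the tree shown; the function name `sequence_dirs` is our own, not part of any released tooling.

```python
from pathlib import Path

def sequence_dirs(images_root: Path):
    """Yield (scene, camera) pairs, e.g. ('1', 'A'), following the layout above."""
    # Scene folders are named '1' ... '12'; sort numerically, not lexically.
    for scene in sorted(images_root.iterdir(), key=lambda p: int(p.name)):
        # Each scene holds one folder per camera ('A', 'B', 'C').
        for camera in sorted(p.name for p in scene.iterdir() if p.is_dir()):
            yield scene.name, camera

# Example (assuming the dataset is extracted next to this script):
#   for scene, camera in sequence_dirs(Path("MMCT") / "images"):
#       print(scene, camera)
```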
### <a id="dow">Dataset Downloads</a>
The whole dataset can be downloaded from [Huggingface](https://huggingface.co/datasets/jellyShuang/MMCT). **Note that each file is password-protected: you can decompress each `.zip` file in its folder after sending us (2212534@mail.dhu.edu.cn, ytzhang@dhu.edu.cn) the [LICENSE](https://github.com/shengyuhao/DIVOTrack/blob/main/LICENSE.md) in any format.**
## <a id="ref">Reference</a>
The license agreement for data usage implies the citation of the paper above. Please notice that citing the dataset URL instead of the publications would not be compliant with this license agreement. You can read the LICENSE from [LICENSE](https://github.com/dhu-mmct/DHU-MMCT/blob/main/LICENSE.md).
## <a id="con">Contact</a>
If you have any concerns, please contact [2212534@mail.dhu.edu.cn](mailto:2212534@mail.dhu.edu.cn)
|
Freela/zeteste | ---
license: openrail
---
|
KaiserML/SemanticScholarAbstracts | ---
dataset_info:
features:
- name: corpusid
dtype: int64
- name: openaccessinfo
struct:
- name: externalids
struct:
- name: ACL
dtype: string
- name: ArXiv
dtype: string
- name: DOI
dtype: string
- name: MAG
dtype: string
- name: PubMedCentral
dtype: string
- name: license
dtype: string
- name: status
dtype: string
- name: url
dtype: string
- name: abstract
dtype: string
- name: updated
dtype: string
splits:
- name: train
num_bytes: 59461773143.463005
num_examples: 48314588
download_size: 37596463269
dataset_size: 59461773143.463005
---
# Dataset Card for "SemanticScholarAbstracts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ravi21/R-awesome-prompts-chatbots | ---
license: creativeml-openrail-m
task_categories:
- translation
- question-answering
- summarization
tags:
- code
size_categories:
- n<1K
---
Prompts and prompt engineering are essential for guiding language models, enabling control over outputs, generating desired content, fostering creativity,
and enhancing the overall user experience. They form a critical component in the interaction between users and AI systems,
ensuring meaningful and contextually appropriate conversations. This is one of the inspirations behind this dataset.
The prompt samples in this dataset were generated by various chatbots, with a few from Bard and ChatGPT.
The two main ideas behind it are 1) prompt engineering and 2) rich data.
These few prompt samples can be helpful for training various
generative AI applications. The dataset is small, but you can generate synthetic data from it.
kishanbodybrain/test-fhir | ---
dataset_info:
features:
- name: fhir
dtype: string
- name: note
dtype: string
splits:
- name: train
num_bytes: 7258577
num_examples: 2726
download_size: 2264600
dataset_size: 7258577
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_tr_conf_gpt2_bestscore | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 0
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_gpt2_bestscore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
itisarainyday/notation | ---
dataset_info:
features:
- name: '0'
dtype: string
splits:
- name: train
num_bytes: 398489
num_examples: 457
- name: validation
num_bytes: 4333
num_examples: 5
download_size: 114401
dataset_size: 402822
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
man4j/ada_v3 | ---
dataset_info:
features:
- name: instruct
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 235764.0
num_examples: 169
download_size: 41722
dataset_size: 235764.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Yasbok/Alpaca_arabic_instruct | ---
language: ar
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 28245695
num_examples: 52002
download_size: 14716254
dataset_size: 28245695
---
# Dataset Card for "Alpaca_arabic_instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigcode/MultiPL-E-completions | ---
pretty_name: MultiPL-E generated programs and execution results
dataset_info:
features:
- name: experiment
dtype: string
- name: problem
dtype: string
- name: language
dtype: string
- name: top_p
dtype: float64
- name: max_tokens
dtype: int64
- name: prompt
dtype: string
- name: tests
dtype: string
- name: stop_tokens
sequence: string
- name: completions
sequence: string
- name: programs
sequence: string
- name: stdouts
sequence: string
- name: stderrs
sequence: string
- name: exit_codes
sequence: int64
- name: statuses
sequence: string
- name: timestamps
sequence: int64
splits:
- name: humaneval.py.bigcode_15b_800m.0.2.reworded
num_bytes: 50941974
num_examples: 161
- name: humaneval.py.bigcode_15b_200m.0.2.reworded
num_bytes: 57850786
num_examples: 161
- name: humaneval.py.bigcode_15b_400m.0.2.reworded
num_bytes: 52404545
num_examples: 161
- name: humaneval.py.bigcode_15b_600m.0.2.reworded
num_bytes: 55071293
num_examples: 161
- name: humaneval.rkt.bigcode_15b_800m.0.2.reworded
num_bytes: 77194321
num_examples: 161
- name: humaneval.py.chatgpt.0.2.reworded
num_bytes: 5473126
num_examples: 161
- name: humaneval.r.bigcode_15b_800m.0.2.reworded
num_bytes: 73160389
num_examples: 161
- name: humaneval.r.bigcode_15b_1000m.0.2.reworded
num_bytes: 63088872
num_examples: 161
- name: humaneval.r.bigcode_15b_200m.0.2.reworded
num_bytes: 77532102
num_examples: 161
- name: humaneval.r.bigcode_15b_400m.0.2.reworded
num_bytes: 80103073
num_examples: 161
- name: humaneval.r.bigcode_15b_600m.0.2.reworded
num_bytes: 76123407
num_examples: 161
- name: humaneval.py.bigcode_15b_1000m.0.2.reworded
num_bytes: 47282373
num_examples: 161
- name: humaneval.py.bigcode_15b_1000m.0.1.reworded
num_bytes: 11724023
num_examples: 161
- name: humaneval.py.bigcode_15b_1000m.0.4.reworded
num_bytes: 12167610
num_examples: 161
- name: humaneval.py.bigcode_15b_1000m.0.6.reworded
num_bytes: 12344190
num_examples: 161
- name: humaneval.py.bigcode_15b_1000m.0.8.reworded
num_bytes: 12825651
num_examples: 161
- name: humaneval.py.codegeex.0.2.reworded
num_bytes: 49547494
num_examples: 161
- name: humaneval.py.codegen.0.2.reworded
num_bytes: 55391634
num_examples: 161
- name: humaneval.py.Salesforce_codegen_16B_mono.0.2.reworded
num_bytes: 54757013
num_examples: 161
- name: humaneval.py.cushman001.0.2.reworded
num_bytes: 5081696
num_examples: 161
- name: humaneval.js.pystarcoder2.0.2.reworded
num_bytes: 6784781
num_examples: 161
- name: humaneval.py.santacoder.0.2.reworded
num_bytes: 57098223
num_examples: 161
- name: humaneval.py.santacoder_fim_prompt.0.2.reworded
num_bytes: 5473782
num_examples: 161
- name: humaneval.lua.bigcode_15b_1000m.0.2.reworded
num_bytes: 53170918
num_examples: 161
- name: mbpp.py.bigcode_15b_1000m.0.2.reworded
num_bytes: 76438518
num_examples: 397
- name: mbpp.py.pystarcoder2.0.2.reworded
num_bytes: 78424728
num_examples: 397
- name: mbpp.lua.davinci.0.2.reworded
num_bytes: 82574073
num_examples: 401
- name: humaneval.js.davinci.0.2.transform
num_bytes: 61113074
num_examples: 161
- name: humaneval.py.davinci.0.2.transform
num_bytes: 46653237
num_examples: 161
- name: humaneval.ts.davinci.0.2.transform
num_bytes: 60332695
num_examples: 159
- name: humaneval.ts.davinci.0.2.reworded
num_bytes: 60256365
num_examples: 159
- name: humaneval.cpp.davinci.0.2.reworded
num_bytes: 67044215
num_examples: 159
- name: humaneval.cs.davinci.0.2.reworded
num_bytes: 103369582
num_examples: 156
- name: humaneval.d.davinci.0.2.reworded
num_bytes: 66641398
num_examples: 156
- name: humaneval.go.davinci.0.2.reworded
num_bytes: 71050586
num_examples: 154
- name: humaneval.java.davinci.0.2.reworded
num_bytes: 71969268
num_examples: 156
- name: humaneval.jl.davinci.0.2.reworded
num_bytes: 76515733
num_examples: 159
- name: humaneval.js.davinci.0.2.reworded
num_bytes: 61674621
num_examples: 161
- name: humaneval.lua.davinci.0.2.reworded
num_bytes: 54016568
num_examples: 161
- name: humaneval.php.davinci.0.2.reworded
num_bytes: 61403511
num_examples: 161
- name: humaneval.pl.davinci.0.2.reworded
num_bytes: 60281076
num_examples: 161
- name: humaneval.py.davinci.0.2.reworded
num_bytes: 48223052
num_examples: 161
- name: humaneval.rb.davinci.0.2.reworded
num_bytes: 65236003
num_examples: 161
- name: humaneval.r.davinci.0.2.reworded
num_bytes: 48031115
num_examples: 161
- name: humaneval.rkt.davinci.0.2.reworded
num_bytes: 66273931
num_examples: 161
- name: humaneval.rs.davinci.0.2.reworded
num_bytes: 56522566
num_examples: 156
- name: humaneval.scala.davinci.0.2.reworded
num_bytes: 84153007
num_examples: 158
- name: humaneval.sh.davinci.0.2.reworded
num_bytes: 58089427
num_examples: 158
- name: humaneval.swift.davinci.0.2.reworded
num_bytes: 61663115
num_examples: 158
- name: humaneval.java.bigcode_15b_1000m.0.2.reworded
num_bytes: 75580922
num_examples: 158
- name: humaneval.java.bigcode_15b_200m.0.2.reworded
num_bytes: 84307515
num_examples: 158
- name: humaneval.java.bigcode_15b_400m.0.2.reworded
num_bytes: 85145220
num_examples: 158
- name: humaneval.java.bigcode_15b_600m.0.2.reworded
num_bytes: 80223655
num_examples: 158
- name: humaneval.java.bigcode_15b_800m.0.2.reworded
num_bytes: 78345454
num_examples: 158
- name: humaneval.java.codegeex.0.2.reworded
num_bytes: 74859734
num_examples: 158
- name: humaneval.java.codegen.0.2.reworded
num_bytes: 76705002
num_examples: 158
- name: humaneval.java.cushman001.0.2.reworded
num_bytes: 7860926
num_examples: 158
- name: humaneval.java.replit_code.0.2.reworded
num_bytes: 20396812
num_examples: 158
- name: humaneval.java.pystarcoder2.0.2.reworded
num_bytes: 73528078
num_examples: 158
- name: humaneval.java.davinci.0.2.keep
num_bytes: 71389774
num_examples: 161
- name: humaneval.java.davinci.0.2.remove
num_bytes: 64493441
num_examples: 157
- name: humaneval.java.davinci.0.2.transform
num_bytes: 72660740
num_examples: 156
- name: humaneval.java.davinci.0.8.keep
num_bytes: 72358729
num_examples: 161
- name: humaneval.java.davinci.0.8.reworded
num_bytes: 75043862
num_examples: 156
- name: humaneval.java.incoder.0.2.keep
num_bytes: 110176353
num_examples: 161
- name: humaneval.java.incoder.0.2.remove
num_bytes: 88903768
num_examples: 157
- name: humaneval.java.incoder.0.2.reworded
num_bytes: 109021885
num_examples: 156
- name: humaneval.java.incoder.0.2.transform
num_bytes: 107183302
num_examples: 156
- name: humaneval.java.incoder.0.8.keep
num_bytes: 75299144
num_examples: 139
- name: humaneval.java.incoder.0.8.reworded
num_bytes: 100533855
num_examples: 158
- name: mbpp.java.codegen.0.2.reworded
num_bytes: 144592215
num_examples: 373
- name: mbpp.java.codegen.0.8.reworded
num_bytes: 47521423
num_examples: 120
- name: mbpp.java.davinci.0.2.keep
num_bytes: 135567713
num_examples: 373
- name: mbpp.java.davinci.0.2.reworded
num_bytes: 136848151
num_examples: 373
- name: mbpp.java.incoder.0.2.reworded
num_bytes: 225046095
num_examples: 373
- name: mbpp.java.incoder.0.8.reworded
num_bytes: 167923488
num_examples: 373
- name: humaneval.cpp.davinci.0.2.keep
num_bytes: 63794632
num_examples: 161
- name: humaneval.cpp.davinci.0.2.remove
num_bytes: 58355394
num_examples: 158
- name: humaneval.cpp.davinci.0.2.transform
num_bytes: 66852210
num_examples: 159
- name: humaneval.cpp.davinci.0.8.keep
num_bytes: 61668425
num_examples: 161
- name: humaneval.cpp.bigcode_15b_1000m.0.2.reworded
num_bytes: 67353068
num_examples: 161
- name: humaneval.cpp.bigcode_15b_200m.0.2.reworded
num_bytes: 73914809
num_examples: 161
- name: humaneval.cpp.bigcode_15b_400m.0.2.reworded
num_bytes: 68514672
num_examples: 161
- name: humaneval.cpp.bigcode_15b_600m.0.2.reworded
num_bytes: 70059227
num_examples: 161
- name: humaneval.cpp.bigcode_15b_800m.0.2.reworded
num_bytes: 69289473
num_examples: 161
- name: humaneval.cpp.codegeex.0.2.reworded
num_bytes: 70250543
num_examples: 161
- name: humaneval.cpp.codegen.0.2.reworded
num_bytes: 65355449
num_examples: 161
- name: humaneval.cpp.cushman001.0.2.reworded
num_bytes: 6878097
num_examples: 161
- name: humaneval.cpp.replit_code.0.2.reworded
num_bytes: 18647873
num_examples: 161
- name: humaneval.cs.bigcode_15b_1000m.0.2.reworded
num_bytes: 115265463
num_examples: 158
- name: humaneval.cs.bigcode_15b_200m.0.2.reworded
num_bytes: 128116325
num_examples: 158
- name: humaneval.cs.bigcode_15b_400m.0.2.reworded
num_bytes: 116443233
num_examples: 158
- name: humaneval.cs.bigcode_15b_600m.0.2.reworded
num_bytes: 110736924
num_examples: 158
- name: humaneval.cs.bigcode_15b_800m.0.2.reworded
num_bytes: 116921504
num_examples: 158
- name: humaneval.cs.codegeex.0.2.reworded
num_bytes: 108831398
num_examples: 158
- name: humaneval.cs.codegen.0.2.reworded
num_bytes: 115085420
num_examples: 158
- name: humaneval.cs.cushman001.0.2.reworded
num_bytes: 11455476
num_examples: 158
- name: humaneval.cs.replit_code.0.2.reworded
num_bytes: 29978496
num_examples: 158
- name: humaneval.d.bigcode_15b_1000m.0.2.reworded
num_bytes: 69856838
num_examples: 156
- name: humaneval.d.bigcode_15b_200m.0.2.reworded
num_bytes: 69168908
num_examples: 156
- name: humaneval.d.bigcode_15b_400m.0.2.reworded
num_bytes: 66130665
num_examples: 156
- name: humaneval.d.bigcode_15b_600m.0.2.reworded
num_bytes: 60081870
num_examples: 156
- name: humaneval.d.bigcode_15b_800m.0.2.reworded
num_bytes: 68285500
num_examples: 156
- name: humaneval.d.codegeex.0.2.reworded
num_bytes: 67554723
num_examples: 156
- name: humaneval.d.codegen.0.2.reworded
num_bytes: 69538065
num_examples: 156
- name: humaneval.d.cushman001.0.2.reworded
num_bytes: 6543145
num_examples: 156
- name: humaneval.d.replit_code.0.2.reworded
num_bytes: 19332975
num_examples: 156
- name: humaneval.go.bigcode_15b_1000m.0.2.reworded
num_bytes: 75966586
num_examples: 154
- name: humaneval.go.bigcode_15b_200m.0.2.reworded
num_bytes: 90496893
num_examples: 154
- name: humaneval.go.bigcode_15b_400m.0.2.reworded
num_bytes: 80263304
num_examples: 154
- name: humaneval.go.bigcode_15b_600m.0.2.reworded
num_bytes: 80653936
num_examples: 154
- name: humaneval.go.bigcode_15b_800m.0.2.reworded
num_bytes: 79636433
num_examples: 154
- name: humaneval.go.codegeex.0.2.reworded
num_bytes: 74466402
num_examples: 154
- name: humaneval.go.codegen.0.2.reworded
num_bytes: 82565036
num_examples: 154
- name: humaneval.go.cushman001.0.2.reworded
num_bytes: 7919252
num_examples: 154
- name: humaneval.go.replit_code.0.2.reworded
num_bytes: 21740421
num_examples: 154
- name: humaneval.jl.bigcode_15b_1000m.0.2.reworded
num_bytes: 64541752
num_examples: 159
- name: humaneval.jl.bigcode_15b_200m.0.2.reworded
num_bytes: 64272523
num_examples: 159
- name: humaneval.jl.bigcode_15b_400m.0.2.reworded
num_bytes: 84674386
num_examples: 159
- name: humaneval.jl.bigcode_15b_600m.0.2.reworded
num_bytes: 83951098
num_examples: 159
- name: humaneval.jl.bigcode_15b_800m.0.2.reworded
num_bytes: 71891875
num_examples: 159
- name: humaneval.jl.codegeex.0.2.reworded
num_bytes: 49376484
num_examples: 159
- name: humaneval.jl.codegen.0.2.reworded
num_bytes: 49686685
num_examples: 159
- name: humaneval.jl.cushman001.0.2.reworded
num_bytes: 5594623
num_examples: 159
- name: humaneval.jl.replit_code.0.2.reworded
num_bytes: 12432167
num_examples: 159
- name: humaneval.js.bigcode_15b_1000m.0.2.reworded
num_bytes: 63930510
num_examples: 161
- name: humaneval.js.bigcode_15b_200m.0.2.reworded
num_bytes: 71006276
num_examples: 161
- name: humaneval.js.bigcode_15b_400m.0.2.reworded
num_bytes: 71141641
num_examples: 161
- name: humaneval.js.bigcode_15b_600m.0.2.reworded
num_bytes: 66406645
num_examples: 161
- name: humaneval.js.bigcode_15b_800m.0.2.reworded
num_bytes: 65906688
num_examples: 161
- name: humaneval.js.codegeex.0.2.reworded
num_bytes: 68965171
num_examples: 161
- name: humaneval.js.codegen.0.2.reworded
num_bytes: 71850674
num_examples: 161
- name: humaneval.js.cushman001.0.2.reworded
num_bytes: 6756809
num_examples: 161
- name: humaneval.js.replit_code.0.2.reworded
num_bytes: 20658701
num_examples: 161
- name: humaneval.lua.bigcode_15b_200m.0.2.reworded
num_bytes: 56733662
num_examples: 161
- name: humaneval.lua.bigcode_15b_400m.0.2.reworded
num_bytes: 57525953
num_examples: 161
- name: humaneval.lua.bigcode_15b_600m.0.2.reworded
num_bytes: 53575875
num_examples: 161
- name: humaneval.lua.bigcode_15b_800m.0.2.reworded
num_bytes: 54309789
num_examples: 161
- name: humaneval.lua.codegeex.0.2.reworded
num_bytes: 53766400
num_examples: 161
- name: humaneval.lua.codegen.0.2.reworded
num_bytes: 63642889
num_examples: 161
- name: humaneval.lua.cushman001.0.2.reworded
num_bytes: 5726991
num_examples: 161
- name: humaneval.lua.replit_code.0.2.reworded
num_bytes: 14458988
num_examples: 161
- name: humaneval.php.bigcode_15b_1000m.0.2.reworded
num_bytes: 62087493
num_examples: 161
- name: humaneval.php.bigcode_15b_200m.0.2.reworded
num_bytes: 67992787
num_examples: 161
- name: humaneval.php.bigcode_15b_400m.0.2.reworded
num_bytes: 65415347
num_examples: 161
- name: humaneval.php.bigcode_15b_600m.0.2.reworded
num_bytes: 64025429
num_examples: 161
- name: humaneval.php.bigcode_15b_800m.0.2.reworded
num_bytes: 67914229
num_examples: 161
- name: humaneval.php.codegeex.0.2.reworded
num_bytes: 63599818
num_examples: 161
- name: humaneval.php.codegen.0.2.reworded
num_bytes: 71759630
num_examples: 161
- name: humaneval.php.cushman001.0.2.reworded
num_bytes: 6680669
num_examples: 161
- name: humaneval.php.replit_code.0.2.reworded
num_bytes: 18347062
num_examples: 161
- name: humaneval.pl.bigcode_15b_1000m.0.2.reworded
num_bytes: 69839042
num_examples: 161
- name: humaneval.pl.bigcode_15b_200m.0.2.reworded
num_bytes: 79671308
num_examples: 161
- name: humaneval.pl.bigcode_15b_400m.0.2.reworded
num_bytes: 78788842
num_examples: 161
- name: humaneval.pl.bigcode_15b_600m.0.2.reworded
num_bytes: 69916889
num_examples: 161
- name: humaneval.pl.bigcode_15b_800m.0.2.reworded
num_bytes: 73552220
num_examples: 161
- name: humaneval.pl.codegeex.0.2.reworded
num_bytes: 72617126
num_examples: 161
- name: humaneval.pl.codegen.0.2.reworded
num_bytes: 74351768
num_examples: 161
- name: humaneval.pl.cushman001.0.2.reworded
num_bytes: 7317844
num_examples: 161
- name: humaneval.pl.replit_code.0.2.reworded
num_bytes: 23014112
num_examples: 161
- name: humaneval.py.bigcode_15b_200m.0.8.reworded
num_bytes: 55679581
num_examples: 161
- name: humaneval.py.bigcode_15b_400m.0.8.reworded
num_bytes: 49813429
num_examples: 161
- name: humaneval.py.replit_code.0.2.reworded
num_bytes: 16222771
num_examples: 161
- name: humaneval.py.starcoder.0.8.reworded
num_bytes: 50428866
num_examples: 161
- name: humaneval.py.starcoderprompted.0.1.reworded
num_bytes: 53971758
num_examples: 161
- name: humaneval.rb.bigcode_15b_1000m.0.2.reworded
num_bytes: 67446763
num_examples: 161
- name: humaneval.rb.bigcode_15b_200m.0.2.reworded
num_bytes: 70571683
num_examples: 161
- name: humaneval.rb.bigcode_15b_400m.0.2.reworded
num_bytes: 67565830
num_examples: 161
- name: humaneval.rb.bigcode_15b_600m.0.2.reworded
num_bytes: 71419194
num_examples: 161
- name: humaneval.rb.bigcode_15b_800m.0.2.reworded
num_bytes: 69995749
num_examples: 161
- name: humaneval.rb.codegeex.0.2.reworded
num_bytes: 63388920
num_examples: 161
- name: humaneval.rb.codegen.0.2.reworded
num_bytes: 68918022
num_examples: 161
- name: humaneval.rb.cushman001.0.2.reworded
num_bytes: 7084615
num_examples: 161
- name: humaneval.rb.replit_code.0.2.reworded
num_bytes: 17797810
num_examples: 161
- name: humaneval.r.codegeex.0.2.reworded
num_bytes: 64172735
num_examples: 161
- name: humaneval.r.codegen.0.2.reworded
num_bytes: 75777642
num_examples: 161
- name: humaneval.r.cushman001.0.2.reworded
num_bytes: 6509329
num_examples: 161
- name: humaneval.rkt.bigcode_15b_1000m.0.2.reworded
num_bytes: 71049799
num_examples: 161
- name: humaneval.rkt.bigcode_15b_200m.0.2.reworded
num_bytes: 72642020
num_examples: 161
- name: humaneval.rkt.bigcode_15b_400m.0.2.reworded
num_bytes: 73564249
num_examples: 161
- name: humaneval.rkt.bigcode_15b_600m.0.2.reworded
num_bytes: 73730273
num_examples: 161
- name: humaneval.rkt.codegeex.0.2.reworded
num_bytes: 70940774
num_examples: 161
- name: humaneval.rkt.codegen.0.2.reworded
num_bytes: 90161741
num_examples: 161
- name: humaneval.rkt.cushman001.0.2.reworded
num_bytes: 6030454
num_examples: 161
- name: humaneval.rkt.replit_code.0.2.reworded
num_bytes: 18423402
num_examples: 161
- name: humaneval.r.replit_code.0.2.reworded
num_bytes: 19677779
num_examples: 161
- name: humaneval.rs.bigcode_15b_1000m.0.2.reworded
num_bytes: 59702550
num_examples: 156
- name: humaneval.rs.bigcode_15b_200m.0.2.reworded
num_bytes: 67759741
num_examples: 156
- name: humaneval.rs.bigcode_15b_400m.0.2.reworded
num_bytes: 68044357
num_examples: 156
- name: humaneval.rs.bigcode_15b_600m.0.2.reworded
num_bytes: 54658037
num_examples: 156
- name: humaneval.rs.bigcode_15b_800m.0.2.reworded
num_bytes: 62854949
num_examples: 156
- name: humaneval.rs.codegeex.0.2.reworded
num_bytes: 67976569
num_examples: 156
- name: humaneval.rs.codegen.0.2.reworded
num_bytes: 82843583
num_examples: 156
- name: humaneval.rs.cushman001.0.2.reworded
num_bytes: 6310774
num_examples: 156
- name: humaneval.rs.replit_code.0.2.reworded
num_bytes: 17624999
num_examples: 156
- name: humaneval.scala.bigcode_15b_1000m.0.2.reworded
num_bytes: 84533532
num_examples: 160
- name: humaneval.scala.bigcode_15b_800m.0.2.reworded
num_bytes: 87023720
num_examples: 160
- name: humaneval.scala.codegeex.0.2.reworded
num_bytes: 79619828
num_examples: 160
- name: humaneval.scala.codegen.0.2.reworded
num_bytes: 128885303
num_examples: 160
- name: humaneval.scala.cushman001.0.2.reworded
num_bytes: 8500865
num_examples: 160
- name: humaneval.scala.replit_code.0.2.reworded
num_bytes: 22458222
num_examples: 160
- name: humaneval.sh.bigcode_15b_1000m.0.2.reworded
num_bytes: 62768941
num_examples: 158
- name: humaneval.sh.bigcode_15b_200m.0.2.reworded
num_bytes: 75630478
num_examples: 158
- name: humaneval.sh.bigcode_15b_400m.0.2.reworded
num_bytes: 77050658
num_examples: 158
- name: humaneval.sh.bigcode_15b_600m.0.2.reworded
num_bytes: 65325746
num_examples: 158
- name: humaneval.swift.bigcode_15b_600m.0.2.reworded
num_bytes: 70424335
num_examples: 158
- name: humaneval.scala.bigcode_15b_200m.0.2.reworded
num_bytes: 89054581
num_examples: 160
- name: humaneval.scala.bigcode_15b_400m.0.2.reworded
num_bytes: 83343360
num_examples: 160
- name: humaneval.scala.bigcode_15b_600m.0.2.reworded
num_bytes: 89752223
num_examples: 160
- name: humaneval.sh.bigcode_15b_800m.0.2.reworded
num_bytes: 66811937
num_examples: 158
- name: humaneval.sh.codegeex.0.2.reworded
num_bytes: 65196768
num_examples: 158
- name: humaneval.sh.codegen.0.2.reworded
num_bytes: 99280481
num_examples: 158
- name: humaneval.sh.cushman001.0.2.reworded
num_bytes: 6237965
num_examples: 158
- name: humaneval.sh.replit_code.0.2.reworded
num_bytes: 18134838
num_examples: 158
- name: humaneval.swift.bigcode_15b_1000m.0.2.reworded
num_bytes: 68129948
num_examples: 158
- name: humaneval.swift.bigcode_15b_200m.0.2.reworded
num_bytes: 76924134
num_examples: 158
- name: humaneval.swift.bigcode_15b_400m.0.2.reworded
num_bytes: 72042977
num_examples: 158
- name: humaneval.swift.bigcode_15b_800m.0.2.reworded
num_bytes: 70027106
num_examples: 158
- name: humaneval.swift.codegeex.0.2.reworded
num_bytes: 73605273
num_examples: 158
- name: humaneval.swift.codegen.0.2.reworded
num_bytes: 76081675
num_examples: 158
- name: humaneval.swift.cushman001.0.2.reworded
num_bytes: 6766506
num_examples: 158
- name: humaneval.swift.replit_code.0.2.reworded
num_bytes: 21605861
num_examples: 158
- name: humaneval.ts.bigcode_15b_1000m.0.2.reworded
num_bytes: 61005831
num_examples: 159
- name: humaneval.ts.bigcode_15b_200m.0.2.reworded
num_bytes: 68875546
num_examples: 159
- name: humaneval.ts.bigcode_15b_400m.0.2.reworded
num_bytes: 62805583
num_examples: 159
- name: humaneval.ts.bigcode_15b_600m.0.2.reworded
num_bytes: 53733690
num_examples: 159
- name: humaneval.ts.bigcode_15b_800m.0.2.reworded
num_bytes: 64371975
num_examples: 159
- name: humaneval.ts.codegeex.0.2.reworded
num_bytes: 58487751
num_examples: 159
- name: humaneval.ts.codegen.0.2.reworded
num_bytes: 69981611
num_examples: 159
- name: humaneval.ts.cushman001.0.2.reworded
num_bytes: 6768589
num_examples: 159
- name: humaneval.ts.replit_code.0.2.reworded
num_bytes: 18208741
num_examples: 159
- name: mbpp.py.codegeex.0.2.reworded
num_bytes: 86906502
num_examples: 397
- name: mbpp.py.codegen.0.2.reworded
num_bytes: 92562493
num_examples: 397
- name: mbpp.py.cushman001.0.2.reworded
num_bytes: 7629346
num_examples: 397
- name: mbpp.py.pystarcoder2.0.1.reworded
num_bytes: 142442817
num_examples: 397
- name: mbpp.py.Salesforce_codegen_16B_mono.0.2.reworded
num_bytes: 86067040
num_examples: 397
- name: humaneval.matlab.keep.gpt_35_turbo_0301.0.2.reworded
num_bytes: 4837906
num_examples: 161
- name: humaneval.cpp.codellama_13b_base.0.2.reworded
num_bytes: 17224400
num_examples: 161
- name: humaneval.cpp.codellama_7b_base.0.2.reworded
num_bytes: 16947382
num_examples: 161
- name: humaneval.cpp.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 17349817
num_examples: 161
- name: humaneval.cpp.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 16452450
num_examples: 161
- name: humaneval.cpp.stablecode3b.0.2.reworded
num_bytes: 67319279
num_examples: 161
- name: humaneval.cpp.StarCoder2_15b_16k.0.2.reworded
num_bytes: 16464290
num_examples: 161
- name: humaneval.cpp.starcoder2_3b_long.0.2.reworded
num_bytes: 6912758
num_examples: 161
- name: humaneval.cpp.StarCoder2_7b_16k.0.2.reworded
num_bytes: 16812656
num_examples: 161
- name: humaneval.cs.codellama_13b_base.0.2.reworded
num_bytes: 27515677
num_examples: 158
- name: humaneval.cs.CodeLlama_34b_base.0.2.reworded
num_bytes: 27108848
num_examples: 158
- name: humaneval.cs.codellama_7b_base.0.2.reworded
num_bytes: 25063010
num_examples: 158
- name: humaneval.cs.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 29224686
num_examples: 158
- name: humaneval.cs.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 26768709
num_examples: 158
- name: humaneval.cs.DeepSeekCoder_34b_base.0.2.reworded
num_bytes: 25747311
num_examples: 158
- name: humaneval.cs.stablecode3b.0.2.reworded
num_bytes: 105810688
num_examples: 158
- name: humaneval.cs.StarCoder2_15b_16k.0.2.reworded
num_bytes: 24656854
num_examples: 158
- name: humaneval.cs.starcoder2_3b_long.0.2.reworded
num_bytes: 10211975
num_examples: 158
- name: humaneval.cs.StarCoder2_7b_16k.0.2.reworded
num_bytes: 24549204
num_examples: 158
- name: humaneval.cs.starcoderbase_3b.0.2.reworded
num_bytes: 30212965
num_examples: 158
- name: humaneval.cs.starcoderbase_7b.0.2.reworded
num_bytes: 29671445
num_examples: 158
- name: humaneval.d.codellama_13b_base.0.2.reworded
num_bytes: 16540135
num_examples: 156
- name: humaneval.d.codellama_7b_base.0.2.reworded
num_bytes: 16378561
num_examples: 156
- name: humaneval.d.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 16380578
num_examples: 156
- name: humaneval.d.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 15678858
num_examples: 156
- name: humaneval.d.stablecode3b.0.2.reworded
num_bytes: 35933717
num_examples: 92
- name: humaneval.d.StarCoder2_15b_16k.0.2.reworded
num_bytes: 15488865
num_examples: 156
- name: humaneval.d.starcoder2_3b_long.0.2.reworded
num_bytes: 6448842
num_examples: 156
- name: humaneval.d.StarCoder2_7b_16k.0.2.reworded
num_bytes: 14879550
num_examples: 156
- name: humaneval.go.codellama_13b_base.0.2.reworded
num_bytes: 18624670
num_examples: 154
- name: humaneval.go.CodeLlama_34b_base.0.2.reworded
num_bytes: 18348739
num_examples: 154
- name: humaneval.go.codellama_7b_base.0.2.reworded
num_bytes: 18328204
num_examples: 154
- name: humaneval.go.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 18484006
num_examples: 154
- name: humaneval.go.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 18461448
num_examples: 154
- name: humaneval.go.DeepSeekCoder_34b_base.0.2.reworded
num_bytes: 17594569
num_examples: 154
- name: humaneval.go.stablecode3b.0.2.reworded
num_bytes: 76254627
num_examples: 154
- name: humaneval.go.StarCoder2_15b_16k.0.2.reworded
num_bytes: 17439839
num_examples: 154
- name: humaneval.go.starcoder2_3b_long.0.2.reworded
num_bytes: 7602923
num_examples: 154
- name: humaneval.go.StarCoder2_7b_16k.0.2.reworded
num_bytes: 17408959
num_examples: 154
- name: humaneval.go.starcoderbase_3b.0.2.reworded
num_bytes: 21037781
num_examples: 154
- name: humaneval.go.starcoderbase_7b.0.2.reworded
num_bytes: 19796229
num_examples: 154
- name: humaneval.java.codellama_13b_base.0.2.reworded
num_bytes: 19317231
num_examples: 158
- name: humaneval.java.codellama_7b_base.0.2.reworded
num_bytes: 18319565
num_examples: 158
- name: humaneval.java.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 19864347
num_examples: 158
- name: humaneval.java.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 18625198
num_examples: 158
- name: humaneval.java.stablecode3b.0.2.reworded
num_bytes: 74579340
num_examples: 158
- name: humaneval.java.StarCoder2_15b_16k.0.2.reworded
num_bytes: 17514550
num_examples: 158
- name: humaneval.java.starcoder2_3b_long.0.2.reworded
num_bytes: 7463704
num_examples: 158
- name: humaneval.java.StarCoder2_7b_16k.0.2.reworded
num_bytes: 18302479
num_examples: 158
- name: humaneval.jl.codellama_13b_base.0.2.reworded
num_bytes: 19477558
num_examples: 159
- name: humaneval.jl.codellama_7b_base.0.2.reworded
num_bytes: 20001814
num_examples: 159
- name: humaneval.jl.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 22131151
num_examples: 159
- name: humaneval.jl.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 18334155
num_examples: 159
- name: humaneval.jl.stablecode3b.0.2.reworded
num_bytes: 82666454
num_examples: 159
- name: humaneval.jl.StarCoder2_15b_16k.0.2.reworded
num_bytes: 19036610
num_examples: 159
- name: humaneval.jl.starcoder2_3b_long.0.2.reworded
num_bytes: 8334068
num_examples: 159
- name: humaneval.jl.StarCoder2_7b_16k.0.2.reworded
num_bytes: 20931800
num_examples: 159
- name: humaneval.js.codellama_13b_base.0.2.reworded
num_bytes: 16473024
num_examples: 161
- name: humaneval.js.codellama_7b_base.0.2.reworded
num_bytes: 16582420
num_examples: 161
- name: humaneval.js.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 16716270
num_examples: 161
- name: humaneval.js.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 15173546
num_examples: 161
- name: humaneval.js.stablecode3b.0.2.reworded
num_bytes: 64385566
num_examples: 161
- name: humaneval.js.StarCoder2_15b_16k.0.2.reworded
num_bytes: 15300799
num_examples: 161
- name: humaneval.js.starcoder2_3b_long.0.2.reworded
num_bytes: 6351328
num_examples: 161
- name: humaneval.js.StarCoder2_7b_16k.0.2.reworded
num_bytes: 15697228
num_examples: 161
- name: humaneval.lua.codellama_13b_base.0.2.reworded
num_bytes: 13273956
num_examples: 161
- name: humaneval.lua.codellama_7b_base.0.2.reworded
num_bytes: 13559092
num_examples: 161
- name: humaneval.lua.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 14465897
num_examples: 161
- name: humaneval.lua.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 13708591
num_examples: 161
- name: humaneval.lua.stablecode3b.0.2.reworded
num_bytes: 56129300
num_examples: 161
- name: humaneval.lua.StarCoder2_15b_16k.0.2.reworded
num_bytes: 13667740
num_examples: 161
- name: humaneval.lua.starcoder2_3b_long.0.2.reworded
num_bytes: 5510129
num_examples: 161
- name: humaneval.lua.StarCoder2_7b_16k.0.2.reworded
num_bytes: 13085989
num_examples: 161
- name: humaneval.php.codellama_13b_base.0.2.reworded
num_bytes: 15705506
num_examples: 161
- name: humaneval.php.codellama_7b_base.0.2.reworded
num_bytes: 15787570
num_examples: 161
- name: humaneval.php.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 15814055
num_examples: 161
- name: humaneval.php.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 15702317
num_examples: 161
- name: humaneval.php.stablecode3b.0.2.reworded
num_bytes: 62279235
num_examples: 161
- name: humaneval.php.StarCoder2_15b_16k.0.2.reworded
num_bytes: 15762455
num_examples: 161
- name: humaneval.php.starcoder2_3b_long.0.2.reworded
num_bytes: 6256161
num_examples: 161
- name: humaneval.php.StarCoder2_7b_16k.0.2.reworded
num_bytes: 15173539
num_examples: 161
- name: humaneval.pl.codellama_13b_base.0.2.reworded
num_bytes: 18073447
num_examples: 161
- name: humaneval.pl.CodeLlama_34b_base.0.2.reworded
num_bytes: 17163359
num_examples: 161
- name: humaneval.pl.codellama_7b_base.0.2.reworded
num_bytes: 17854674
num_examples: 161
- name: humaneval.pl.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 18760773
num_examples: 161
- name: humaneval.pl.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 17873165
num_examples: 161
- name: humaneval.pl.DeepSeekCoder_34b_base.0.2.reworded
num_bytes: 17282729
num_examples: 161
- name: humaneval.pl.stablecode3b.0.2.reworded
num_bytes: 71926624
num_examples: 161
- name: humaneval.pl.StarCoder2_15b_16k.0.2.reworded
num_bytes: 17260449
num_examples: 161
- name: humaneval.pl.starcoder2_3b_long.0.2.reworded
num_bytes: 7323910
num_examples: 161
- name: humaneval.pl.StarCoder2_7b_16k.0.2.reworded
num_bytes: 17386798
num_examples: 161
- name: humaneval.pl.starcoderbase_3b.0.2.reworded
num_bytes: 17425724
num_examples: 161
- name: humaneval.pl.starcoderbase_7b.0.2.reworded
num_bytes: 17232522
num_examples: 161
- name: humaneval.rb.codellama_13b_base.0.2.reworded
num_bytes: 16924279
num_examples: 161
- name: humaneval.rb.CodeLlama_34b_base.0.2.reworded
num_bytes: 16076508
num_examples: 161
- name: humaneval.rb.codellama_7b_base.0.2.reworded
num_bytes: 17352418
num_examples: 161
- name: humaneval.rb.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 17880997
num_examples: 161
- name: humaneval.rb.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 16637852
num_examples: 161
- name: humaneval.rb.DeepSeekCoder_34b_base.0.2.reworded
num_bytes: 15774077
num_examples: 161
- name: humaneval.rb.stablecode3b.0.2.reworded
num_bytes: 67134234
num_examples: 161
- name: humaneval.rb.StarCoder2_15b_16k.0.2.reworded
num_bytes: 16344062
num_examples: 161
- name: humaneval.rb.starcoder2_3b_long.0.2.reworded
num_bytes: 6938906
num_examples: 161
- name: humaneval.rb.StarCoder2_7b_16k.0.2.reworded
num_bytes: 16973867
num_examples: 161
- name: humaneval.rb.starcoderbase_3b.0.2.reworded
num_bytes: 17503070
num_examples: 161
- name: humaneval.rb.starcoderbase_7b.0.2.reworded
num_bytes: 17444427
num_examples: 161
- name: humaneval.r.codellama_13b_base.0.2.reworded
num_bytes: 16765203
num_examples: 161
- name: humaneval.r.codellama_7b_base.0.2.reworded
num_bytes: 16042879
num_examples: 161
- name: humaneval.r.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 18188961
num_examples: 161
- name: humaneval.r.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 15927073
num_examples: 161
- name: humaneval.r.DeepSeekCoder_34b_base.0.2.reworded
num_bytes: 11519925
num_examples: 161
- name: humaneval.rkt.codellama_13b_base.0.2.reworded
num_bytes: 17815474
num_examples: 161
- name: humaneval.rkt.codellama_7b_base.0.2.reworded
num_bytes: 17859177
num_examples: 161
- name: humaneval.rkt.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 17714145
num_examples: 161
- name: humaneval.rkt.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 17785261
num_examples: 161
- name: humaneval.rkt.stablecode3b.0.2.reworded
num_bytes: 70190960
num_examples: 161
- name: humaneval.rkt.StarCoder2_15b_16k.0.2.reworded
num_bytes: 16095558
num_examples: 161
- name: humaneval.rkt.starcoder2_3b_long.0.2.reworded
num_bytes: 7229090
num_examples: 161
- name: humaneval.rkt.StarCoder2_7b_16k.0.2.reworded
num_bytes: 16284554
num_examples: 161
- name: humaneval.rs.codellama_13b_base.0.2.reworded
num_bytes: 15195007
num_examples: 156
- name: humaneval.rs.codellama_7b_base.0.2.reworded
num_bytes: 15714251
num_examples: 156
- name: humaneval.rs.deepseekcoder_1.3b_base.0.2.reworded
num_bytes: 15792067
num_examples: 156
- name: humaneval.rs.deepseekcoder1.5_7b_base.0.2.reworded
num_bytes: 14351037
num_examples: 156
- name: humaneval.rs.stablecode3b.0.2.reworded
num_bytes: 61739739
num_examples: 156
- name: humaneval.rs.StarCoder2_15b_16k.0.2.reworded
num_bytes: 14340153
num_examples: 156
- name: humaneval.rs.starcoder2_3b_long.0.2.reworded
num_bytes: 6139379
num_examples: 156
- name: humaneval.rs.StarCoder2_7b_16k.0.2.reworded
num_bytes: 14671151
num_examples: 156
- name: humaneval.r.stablecode3b.0.2.reworded
num_bytes: 62027260
num_examples: 161
- name: humaneval.r.StarCoder2_15b_16k.0.2.reworded
num_bytes: 14198671
num_examples: 161
- name: humaneval.r.starcoder2_3b_long.0.2.reworded
num_bytes: 6471459
num_examples: 161
download_size: 1518755447
dataset_size: 17591575904
tags:
- code
configs:
- config_name: default
data_files:
- split: humaneval.cpp.codellama_13b_base.0.2.reworded
path: data/humaneval.cpp.codellama_13b_base.0.2.reworded-*
- split: humaneval.cpp.codellama_7b_base.0.2.reworded
path: data/humaneval.cpp.codellama_7b_base.0.2.reworded-*
- split: humaneval.cpp.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.cpp.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.cpp.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.cpp.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.cpp.stablecode3b.0.2.reworded
path: data/humaneval.cpp.stablecode3b.0.2.reworded-*
- split: humaneval.cpp.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.cpp.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.cpp.starcoder2_3b_long.0.2.reworded
path: data/humaneval.cpp.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.cpp.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.cpp.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.cs.codellama_13b_base.0.2.reworded
path: data/humaneval.cs.codellama_13b_base.0.2.reworded-*
- split: humaneval.cs.CodeLlama_34b_base.0.2.reworded
path: data/humaneval.cs.CodeLlama_34b_base.0.2.reworded-*
- split: humaneval.cs.codellama_7b_base.0.2.reworded
path: data/humaneval.cs.codellama_7b_base.0.2.reworded-*
- split: humaneval.cs.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.cs.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.cs.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.cs.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.cs.DeepSeekCoder_34b_base.0.2.reworded
path: data/humaneval.cs.DeepSeekCoder_34b_base.0.2.reworded-*
- split: humaneval.cs.stablecode3b.0.2.reworded
path: data/humaneval.cs.stablecode3b.0.2.reworded-*
- split: humaneval.cs.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.cs.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.cs.starcoder2_3b_long.0.2.reworded
path: data/humaneval.cs.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.cs.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.cs.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.cs.starcoderbase_3b.0.2.reworded
path: data/humaneval.cs.starcoderbase_3b.0.2.reworded-*
- split: humaneval.cs.starcoderbase_7b.0.2.reworded
path: data/humaneval.cs.starcoderbase_7b.0.2.reworded-*
- split: humaneval.d.codellama_13b_base.0.2.reworded
path: data/humaneval.d.codellama_13b_base.0.2.reworded-*
- split: humaneval.d.codellama_7b_base.0.2.reworded
path: data/humaneval.d.codellama_7b_base.0.2.reworded-*
- split: humaneval.d.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.d.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.d.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.d.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.d.stablecode3b.0.2.reworded
path: data/humaneval.d.stablecode3b.0.2.reworded-*
- split: humaneval.d.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.d.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.d.starcoder2_3b_long.0.2.reworded
path: data/humaneval.d.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.d.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.d.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.go.codellama_13b_base.0.2.reworded
path: data/humaneval.go.codellama_13b_base.0.2.reworded-*
- split: humaneval.go.CodeLlama_34b_base.0.2.reworded
path: data/humaneval.go.CodeLlama_34b_base.0.2.reworded-*
- split: humaneval.go.codellama_7b_base.0.2.reworded
path: data/humaneval.go.codellama_7b_base.0.2.reworded-*
- split: humaneval.go.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.go.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.go.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.go.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.go.DeepSeekCoder_34b_base.0.2.reworded
path: data/humaneval.go.DeepSeekCoder_34b_base.0.2.reworded-*
- split: humaneval.go.stablecode3b.0.2.reworded
path: data/humaneval.go.stablecode3b.0.2.reworded-*
- split: humaneval.go.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.go.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.go.starcoder2_3b_long.0.2.reworded
path: data/humaneval.go.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.go.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.go.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.go.starcoderbase_3b.0.2.reworded
path: data/humaneval.go.starcoderbase_3b.0.2.reworded-*
- split: humaneval.go.starcoderbase_7b.0.2.reworded
path: data/humaneval.go.starcoderbase_7b.0.2.reworded-*
- split: humaneval.java.codellama_13b_base.0.2.reworded
path: data/humaneval.java.codellama_13b_base.0.2.reworded-*
- split: humaneval.java.codellama_7b_base.0.2.reworded
path: data/humaneval.java.codellama_7b_base.0.2.reworded-*
- split: humaneval.java.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.java.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.java.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.java.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.java.stablecode3b.0.2.reworded
path: data/humaneval.java.stablecode3b.0.2.reworded-*
- split: humaneval.java.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.java.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.java.starcoder2_3b_long.0.2.reworded
path: data/humaneval.java.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.java.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.java.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.jl.codellama_13b_base.0.2.reworded
path: data/humaneval.jl.codellama_13b_base.0.2.reworded-*
- split: humaneval.jl.codellama_7b_base.0.2.reworded
path: data/humaneval.jl.codellama_7b_base.0.2.reworded-*
- split: humaneval.jl.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.jl.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.jl.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.jl.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.jl.stablecode3b.0.2.reworded
path: data/humaneval.jl.stablecode3b.0.2.reworded-*
- split: humaneval.jl.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.jl.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.jl.starcoder2_3b_long.0.2.reworded
path: data/humaneval.jl.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.jl.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.jl.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.js.codellama_13b_base.0.2.reworded
path: data/humaneval.js.codellama_13b_base.0.2.reworded-*
- split: humaneval.js.codellama_7b_base.0.2.reworded
path: data/humaneval.js.codellama_7b_base.0.2.reworded-*
- split: humaneval.js.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.js.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.js.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.js.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.js.stablecode3b.0.2.reworded
path: data/humaneval.js.stablecode3b.0.2.reworded-*
- split: humaneval.js.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.js.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.js.starcoder2_3b_long.0.2.reworded
path: data/humaneval.js.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.js.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.js.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.lua.codellama_13b_base.0.2.reworded
path: data/humaneval.lua.codellama_13b_base.0.2.reworded-*
- split: humaneval.lua.codellama_7b_base.0.2.reworded
path: data/humaneval.lua.codellama_7b_base.0.2.reworded-*
- split: humaneval.lua.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.lua.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.lua.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.lua.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.lua.stablecode3b.0.2.reworded
path: data/humaneval.lua.stablecode3b.0.2.reworded-*
- split: humaneval.lua.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.lua.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.lua.starcoder2_3b_long.0.2.reworded
path: data/humaneval.lua.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.lua.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.lua.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.php.codellama_13b_base.0.2.reworded
path: data/humaneval.php.codellama_13b_base.0.2.reworded-*
- split: humaneval.php.codellama_7b_base.0.2.reworded
path: data/humaneval.php.codellama_7b_base.0.2.reworded-*
- split: humaneval.php.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.php.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.php.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.php.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.php.stablecode3b.0.2.reworded
path: data/humaneval.php.stablecode3b.0.2.reworded-*
- split: humaneval.php.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.php.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.php.starcoder2_3b_long.0.2.reworded
path: data/humaneval.php.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.php.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.php.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.pl.codellama_13b_base.0.2.reworded
path: data/humaneval.pl.codellama_13b_base.0.2.reworded-*
- split: humaneval.pl.CodeLlama_34b_base.0.2.reworded
path: data/humaneval.pl.CodeLlama_34b_base.0.2.reworded-*
- split: humaneval.pl.codellama_7b_base.0.2.reworded
path: data/humaneval.pl.codellama_7b_base.0.2.reworded-*
- split: humaneval.pl.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.pl.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.pl.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.pl.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.pl.DeepSeekCoder_34b_base.0.2.reworded
path: data/humaneval.pl.DeepSeekCoder_34b_base.0.2.reworded-*
- split: humaneval.pl.stablecode3b.0.2.reworded
path: data/humaneval.pl.stablecode3b.0.2.reworded-*
- split: humaneval.pl.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.pl.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.pl.starcoder2_3b_long.0.2.reworded
path: data/humaneval.pl.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.pl.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.pl.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.pl.starcoderbase_3b.0.2.reworded
path: data/humaneval.pl.starcoderbase_3b.0.2.reworded-*
- split: humaneval.pl.starcoderbase_7b.0.2.reworded
path: data/humaneval.pl.starcoderbase_7b.0.2.reworded-*
- split: humaneval.rb.codellama_13b_base.0.2.reworded
path: data/humaneval.rb.codellama_13b_base.0.2.reworded-*
- split: humaneval.rb.CodeLlama_34b_base.0.2.reworded
path: data/humaneval.rb.CodeLlama_34b_base.0.2.reworded-*
- split: humaneval.rb.codellama_7b_base.0.2.reworded
path: data/humaneval.rb.codellama_7b_base.0.2.reworded-*
- split: humaneval.rb.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.rb.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.rb.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.rb.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.rb.DeepSeekCoder_34b_base.0.2.reworded
path: data/humaneval.rb.DeepSeekCoder_34b_base.0.2.reworded-*
- split: humaneval.rb.stablecode3b.0.2.reworded
path: data/humaneval.rb.stablecode3b.0.2.reworded-*
- split: humaneval.rb.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.rb.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.rb.starcoder2_3b_long.0.2.reworded
path: data/humaneval.rb.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.rb.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.rb.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.rb.starcoderbase_3b.0.2.reworded
path: data/humaneval.rb.starcoderbase_3b.0.2.reworded-*
- split: humaneval.rb.starcoderbase_7b.0.2.reworded
path: data/humaneval.rb.starcoderbase_7b.0.2.reworded-*
- split: humaneval.r.codellama_13b_base.0.2.reworded
path: data/humaneval.r.codellama_13b_base.0.2.reworded-*
- split: humaneval.r.codellama_7b_base.0.2.reworded
path: data/humaneval.r.codellama_7b_base.0.2.reworded-*
- split: humaneval.r.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.r.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.r.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.r.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.r.DeepSeekCoder_34b_base.0.2.reworded
path: data/humaneval.r.DeepSeekCoder_34b_base.0.2.reworded-*
- split: humaneval.rkt.codellama_13b_base.0.2.reworded
path: data/humaneval.rkt.codellama_13b_base.0.2.reworded-*
- split: humaneval.rkt.codellama_7b_base.0.2.reworded
path: data/humaneval.rkt.codellama_7b_base.0.2.reworded-*
- split: humaneval.rkt.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.rkt.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.rkt.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.rkt.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.rkt.stablecode3b.0.2.reworded
path: data/humaneval.rkt.stablecode3b.0.2.reworded-*
- split: humaneval.rkt.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.rkt.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.rkt.starcoder2_3b_long.0.2.reworded
path: data/humaneval.rkt.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.rkt.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.rkt.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.rs.codellama_13b_base.0.2.reworded
path: data/humaneval.rs.codellama_13b_base.0.2.reworded-*
- split: humaneval.rs.codellama_7b_base.0.2.reworded
path: data/humaneval.rs.codellama_7b_base.0.2.reworded-*
- split: humaneval.rs.deepseekcoder_1.3b_base.0.2.reworded
path: data/humaneval.rs.deepseekcoder_1.3b_base.0.2.reworded-*
- split: humaneval.rs.deepseekcoder1.5_7b_base.0.2.reworded
path: data/humaneval.rs.deepseekcoder1.5_7b_base.0.2.reworded-*
- split: humaneval.rs.stablecode3b.0.2.reworded
path: data/humaneval.rs.stablecode3b.0.2.reworded-*
- split: humaneval.rs.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.rs.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.rs.starcoder2_3b_long.0.2.reworded
path: data/humaneval.rs.starcoder2_3b_long.0.2.reworded-*
- split: humaneval.rs.StarCoder2_7b_16k.0.2.reworded
path: data/humaneval.rs.StarCoder2_7b_16k.0.2.reworded-*
- split: humaneval.r.stablecode3b.0.2.reworded
path: data/humaneval.r.stablecode3b.0.2.reworded-*
- split: humaneval.r.StarCoder2_15b_16k.0.2.reworded
path: data/humaneval.r.StarCoder2_15b_16k.0.2.reworded-*
- split: humaneval.r.starcoder2_3b_long.0.2.reworded
path: data/humaneval.r.starcoder2_3b_long.0.2.reworded-*
---
# Raw Data from MultiPL-E
**Uploads are a work in progress. If you are interested in a split that is not yet available, please contact a.guha@northeastern.edu.**
This repository contains the raw data -- both completions and executions -- from MultiPL-E that was used to generate several experimental results from the
MultiPL-E, SantaCoder, and StarCoder papers.
The original MultiPL-E completions and executions are stored in JSON files. We use [the following script](https://github.com/nuprl/MultiPL-E/blob/main/upload_completions.py)
to turn each experiment directory into a dataset split and upload it to this repository.
Every split is named `base_dataset`.`language`.`model`.`temperature`.`variation`, where:
- `base_dataset` is either `humaneval` or `mbpp`.
- `language` is the file extension of the programming language. E.g., `py` for Python or `sh` for Bash.
- `model` is the name of the model. Some model names used by MultiPL-E:
- `bigcode_15b_1000m`: StarCoderBase
- `bigcode_15b_200m`, `bigcode_15b_400m`, `bigcode_15b_600m`, `bigcode_15b_800m`: the 20%, 40%, 60%, 80% checkpoints for StarCoderBase
- `pystarcoder2`: StarCoder
- there are others, just have a look :)
- `temperature`: this is either 0.2 or 0.8
- `variation`: the variation of the MultiPL-E prompt to use, which should be `reworded`. MultiPL-E has several prompt ablations and the `reworded` prompts are the canonical variation. |
davanstrien/model_cards_with_readmes | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: author
dtype: string
- name: model_type
dtype: string
- name: files_per_repo
dtype: int64
- name: downloads_30d
dtype: int64
- name: library
dtype: string
- name: likes
dtype: int64
- name: pipeline
dtype: string
- name: pytorch
dtype: bool
- name: tensorflow
dtype: bool
- name: jax
dtype: bool
- name: license
dtype: string
- name: languages
dtype: string
- name: datasets
dtype: string
- name: co2
dtype: string
- name: prs_count
dtype: int64
- name: prs_open
dtype: int64
- name: prs_merged
dtype: int64
- name: prs_closed
dtype: int64
- name: discussions_count
dtype: int64
- name: discussions_open
dtype: int64
- name: discussions_closed
dtype: int64
- name: tags
dtype: string
- name: has_model_index
dtype: bool
- name: has_metadata
dtype: bool
- name: has_text
dtype: bool
- name: text_length
dtype: int64
- name: is_nc
dtype: bool
- name: readme
dtype: string
- name: hash
dtype: string
splits:
- name: train
num_bytes: 91746845.07931802
num_examples: 29806
download_size: 37088334
dataset_size: 91746845.07931802
---
# Dataset Card for "model_cards_with_readmes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
enoahjr/twitter_dataset_1713143198 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 139098
num_examples: 390
download_size: 46727
dataset_size: 139098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hayesyang/news | ---
dataset_info:
features:
- name: url
dtype: string
- name: content
dtype: string
splits:
- name: zh
num_bytes: 342700881
num_examples: 2771
- name: en
num_bytes: 291917240
num_examples: 2258
- name: fr
num_bytes: 154707197
num_examples: 1201
- name: es
num_bytes: 221805819
num_examples: 1695
- name: ru
num_bytes: 121776777
num_examples: 926
- name: ar
num_bytes: 118422112
num_examples: 883
download_size: 528278861
dataset_size: 1251330026
---
# Dataset Card for "news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bazsalanszky/reddit_hu | ---
language:
- hu
license: cc-by-3.0
pretty_name: r
dataset_info:
features:
- name: title
dtype: string
- name: author_flair_text
dtype: string
- name: selftext
dtype: string
- name: subreddit
dtype: string
- name: is_video
dtype: bool
- name: num_crossposts
dtype: int64
- name: subreddit_subscribers
dtype: int64
- name: url
dtype: string
- name: num_comments
dtype: int64
- name: author
dtype: string
- name: treatment_tags
sequence: 'null'
- name: all_awardings
sequence: 'null'
- name: is_crosspostable
dtype: bool
- name: view_count
dtype: 'null'
- name: after
dtype: string
- name: downs
dtype: int64
- name: ups
dtype: int64
- name: comments
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
list:
- name: author
dtype: string
- name: body
dtype: string
- name: downs
dtype: int64
- name: replies
sequence: 'null'
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
- name: ups
dtype: int64
splits:
- name: train
num_bytes: 1447024568
num_examples: 138944
download_size: 736424735
dataset_size: 1447024568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Hungarian Reddit dataset
This dataset is a comprehensive collection of roughly 140,000 Reddit posts from the r/hungary and r/askhungary subreddits (more may follow later), together with their comments (though not for every post). The dataset mainly covers posts from the last few weeks, and its purpose is to support learning informal Hungarian, especially for developing large language models. The collection is a rich source of discussions, opinions, and questions on a wide range of topics, providing an excellent basis for learning the language as it is actually used.
## Bias and Limitations
It is important to note that while this dataset can be very useful for learning Hungarian, it may contain certain biases rooted in the opinions of the Reddit user community. When analyzing and using this kind of data, keep in mind that the opinions and topics may have limited representativeness and do not necessarily reflect the full spectrum of the Hungarian-speaking community or Hungarian culture. It is therefore advisable to approach the data critically and to integrate information from multiple sources in order to reach a more balanced and comprehensive understanding for language-learning purposes. |
sauravjoshi23/hotpot_qa_llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 516711452
num_examples: 90447
download_size: 296153466
dataset_size: 516711452
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_229 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 931160344
num_examples: 181442
download_size: 946816741
dataset_size: 931160344
---
# Dataset Card for "chunk_229"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fvr2/dataset-test02 | ---
license: other
task_categories:
- text-generation
language:
- en
tags:
- music
--- |
MattyB95/Synthetic_Voice_Detection_Resources | ---
license: mit
task_categories:
- audio-classification
language:
- en
tags:
- code
pretty_name: Synthetic Voice Detection Resources (VoxCelebSpoof)
size_categories:
- n<1K
--- |
mietlinski/parking_labeled_cropped | ---
license: mit
---
|
SkyHuReal/DrugBank-Alpaca | ---
license: afl-3.0
---
|
Qqcf16426/mangaupdates | ---
language:
- en
tags:
- manga
- tags
- genres
- scraped
size_categories:
- 100K<n<1M
---
I scraped [mangaupdates](https://www.mangaupdates.com) for a project and I am sharing the data. There is a tar file which contains the JSON response for every infos entry.
I parsed it and added it to a Postgres database. The pg_dump was uploaded too. There are some entries that no longer exist; they can be found in the removed-ids JSON.
<details>
<summary>SQL structure</summary>
I didn't try to make it an optimal structure, but I tried to remove the redundancy of strings.
### Info
```sql
create table info
(
id serial primary key,
private_id int,
public_id bigint not null,
forum_id bigint not null,
url_key text not null,
url_name text,
titles text[] not null,
description text,
image_name text,
typ int not null,
year int,
latest_chapter integer not null,
rating integer not null,
bayesian_rating float,
genres int[] not null,
tags int[] not null,
tags_upvotes int[] not null,
tags_downvotes int[] not null,
tags_uploader bigint[] not null,
status text,
licensed boolean not null,
completed boolean not null,
author int[] not null,
artist int[] not null,
publisher_original int[] not null,
publisher_english int[] not null,
publication text[] not null,
publication_publisher int[] not null,
relations text[] not null,
anime_start text,
anime_end text,
last_updated_mu TIMESTAMP,
last_updated TIMESTAMP not null,
created TIMESTAMP not null
);
```
### Types
```sql
create table if not exists mtypes
(
id serial primary key,
name text not null
);
```
### Genres
```sql
create table if not exists genres
(
id serial primary key,
name text not null
);
```
### Tags
```sql
create table if not exists tags
(
id serial primary key,
name text not null
);
```
### People
```sql
create table if not exists ppl
(
id serial primary key,
mu_id bigint,
name text not null
);
```
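
As a quick sanity check of the schema, a query along these lines (a sketch, assuming a standard PostgreSQL setup and the column names above) resolves the integer genre ids stored on each entry back to their names:

```sql
-- Illustrative PostgreSQL query: expand the genres int[] column on info
-- into the genre names it references. Table/column names are taken from
-- the dump above; the query itself is only an example.
SELECT i.titles[1]        AS title,
       array_agg(g.name)  AS genre_names
FROM info i
JOIN genres g ON g.id = ANY (i.genres)
GROUP BY i.id, i.titles
LIMIT 10;
```

The same pattern works for the `tags`, `author`, and `artist` arrays, joining against `tags` and `ppl` respectively.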
</details> |
neuralchen/VGGFace2-HQ | ---
license: apache-2.0
---
|
bprateek/amazon_product_description | ---
license: apache-2.0
---
|
kopan/docfullstructure_dataset | ---
task_categories:
- text-classification
language:
- ru
- en
- kk
- bg
- ca
- cs
- da
- de
- el
- es
- fi
- fr
- hr
- hu
- it
- jp
- ko
- ky
- lt
- mk
- nl
- 'no'
- pl
- pt
- ro
- sl
- sr
- tr
- uk
- zh
pretty_name: DocFullStructure
size_categories:
- n<1K
tags:
- scientific
- academic
- document
license: apache-2.0
--- |
Sleoruiz/discursos-primera-class-separated-by-idx | ---
dataset_info:
features:
- name: text
dtype: string
- name: name
dtype: string
- name: comision
dtype: string
- name: gaceta_numero
dtype: string
- name: fecha_gaceta
dtype: string
- name: labels
sequence: string
- name: scores
sequence: float64
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 33715220
num_examples: 21172
download_size: 16042383
dataset_size: 33715220
---
# Dataset Card for "discursos-primera-class-separated-by-idx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EdBerg/baha | ---
license: openrail
---
|
Bhandari10/CUB-200-2011-Ne | ---
configs:
- config_name: default
data_files:
- split: train
path:
- text_c10_nepali/002.Laysan_Albatross/Laysan_Albatross_0002_1027.txt
- text_c10_nepali/002.Laysan_Albatross/Laysan_Albatross_0003_1033.txt
- text_c10_nepali/002.Laysan_Albatross/Laysan_Albatross_0082_524.txt
- text_c10_nepali/002.Laysan_Albatross/Laysan_Albatross_0044_784.txt
- text_c10_nepali/002.Laysan_Albatross/Laysan_Albatross_0070_788.txt
- split: test
path:
- text_c10_nepali/001.Black_footed_Albatross/Black_Footed_Albatross_0046_18.txt
- text_c10_nepali/001.Black_footed_Albatross/Black_Footed_Albatross_0009_34.txt
- text_c10_nepali/001.Black_footed_Albatross/Black_Footed_Albatross_0002_55.txt
- text_c10_nepali/001.Black_footed_Albatross/Black_Footed_Albatross_0074_59.txt
- text_c10_nepali/001.Black_footed_Albatross/Black_Footed_Albatross_0014_89.txt
language:
- ne
license: apache-2.0
--- |
unaidedelf87777/slimorca-sem_deduped | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_message
dtype: string
- name: instruction
dtype: string
- name: completion
dtype: string
- name: meta
struct:
- name: topic_depth_1
dtype: string
- name: topic_depth_2
dtype: string
- name: topic_depth_3
dtype: string
splits:
- name: train
num_bytes: 834398137
num_examples: 477358
download_size: 423106996
dataset_size: 834398137
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
WARNING: VERY MUCH A WORK IN PROGRESS. NOT YET USABLE; RLHF INSTANCES HAVE NOT BEEN REMOVED YET. |
benayas/snips_llm_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 5123866
num_examples: 13084
- name: test
num_bytes: 549670
num_examples: 1400
download_size: 761168
dataset_size: 5673536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kwakhyok/high-quality-unsplash-tags | ---
license: mit
---
|
sazirarrwth99/processed_demo | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 11442
num_examples: 3
download_size: 28994
dataset_size: 11442
---
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mattymchen/cr | ---
language:
- en
task_categories:
- text-classification
task_ids:
- sentiment-classification
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 408668
num_examples: 3775
download_size: 244814
dataset_size: 408668
---
# Dataset Card for "cr"
## Dataset Description
Product review dataset from SentEval.
## Data Fields
- `text`: complete sentence expressing an opinion about a product.
- `label`: sentiment of the opinion: negative (0) or positive (1).
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fun1021183/cvt1_GS3_test3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 633806034.0
num_examples: 3900
- name: test
num_bytes: 385612653.92
num_examples: 2480
download_size: 918457935
dataset_size: 1019418687.9200001
---
# Dataset Card for "cvt1_GS3_test3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
duncsand/english_pii-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3088707
num_examples: 10000
download_size: 1606680
dataset_size: 3088707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
daisr/test | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 5530314.0
num_examples: 5
download_size: 545067
dataset_size: 5530314.0
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MicPie/unpredictable_cluster08 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster08
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster08" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have 1000's of tasks, while each task has only a few examples, compared to most current NLP datasets which are very deep, i.e., 10s of tasks with many examples. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
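
As a rough sketch of how such examples can be turned into a few-shot prompt, the fields described above can be concatenated as follows (the prompt template itself is an illustrative assumption, not the exact format used in the paper):

```python
# Sketch: concatenate UnpredicTable-style examples (with 'input', 'options',
# 'output' fields as described above) into a single few-shot prompt string.
# The template is illustrative, not the paper's exact prompt format.

def build_fewshot_prompt(examples):
    """Join task examples into one few-shot prompt, one block per example."""
    blocks = []
    for ex in examples:
        block = f"Input: {ex['input']}\n"
        if ex.get("options"):
            # Multiple-choice tasks list the candidate classes to choose from.
            block += "Options: " + ", ".join(ex["options"]) + "\n"
        block += f"Output: {ex['output']}"
        blocks.append(block)
    return "\n\n".join(blocks)

# Hypothetical rows from one table-derived task:
demo = [
    {"input": "[Name] Tokyo [Country]", "options": ["Japan", "France"], "output": "Japan"},
    {"input": "[Name] Paris [Country]", "options": ["Japan", "France"], "output": "France"},
]
prompt = build_fewshot_prompt(demo)
```

In actual use, the last example's output would be held out as the target the model must predict.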
### Data Fields
'task': task identifier
'input': column elements of a specific row in the table.
'options': for multiple choice classification, it provides the options to choose from.
'output': target column element of the same row as input.
'pageTitle': the title of the page containing the table.
'outputColName': output column name
'url': url to the website containing the table
'wdcFile': WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed instructions of the annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
open-llm-leaderboard/details_Deci__DeciCoder-1b | ---
pretty_name: Evaluation run of Deci/DeciCoder-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Deci/DeciCoder-1b](https://huggingface.co/Deci/DeciCoder-1b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Deci__DeciCoder-1b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T12:47:40.264080](https://huggingface.co/datasets/open-llm-leaderboard/details_Deci__DeciCoder-1b_public/blob/main/results_2023-11-08T12-47-40.264080.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.0002568002749723942,\n \"f1\": 0.02978817114093966,\n\
\ \"f1_stderr\": 0.0009513874747103622,\n \"acc\": 0.26286237271664875,\n\
\ \"acc_stderr\": 0.00882802109541121\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.0002568002749723942,\n\
\ \"f1\": 0.02978817114093966,\n \"f1_stderr\": 0.0009513874747103622\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \
\ \"acc_stderr\": 0.003605486867998233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5082872928176796,\n \"acc_stderr\": 0.014050555322824189\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Deci/DeciCoder-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T18_16_50.163751
path:
- '**/details_harness|drop|3_2023-11-05T18-16-50.163751.parquet'
- split: 2023_11_08T12_47_40.264080
path:
- '**/details_harness|drop|3_2023-11-08T12-47-40.264080.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T12-47-40.264080.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T18_16_50.163751
path:
- '**/details_harness|gsm8k|5_2023-11-05T18-16-50.163751.parquet'
- split: 2023_11_08T12_47_40.264080
path:
- '**/details_harness|gsm8k|5_2023-11-08T12-47-40.264080.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T12-47-40.264080.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T18_16_50.163751
path:
- '**/details_harness|winogrande|5_2023-11-05T18-16-50.163751.parquet'
- split: 2023_11_08T12_47_40.264080
path:
- '**/details_harness|winogrande|5_2023-11-08T12-47-40.264080.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T12-47-40.264080.parquet'
- config_name: results
data_files:
- split: 2023_11_05T18_16_50.163751
path:
- results_2023-11-05T18-16-50.163751.parquet
- split: 2023_11_08T12_47_40.264080
path:
- results_2023-11-08T12-47-40.264080.parquet
- split: latest
path:
- results_2023-11-08T12-47-40.264080.parquet
---
# Dataset Card for Evaluation run of Deci/DeciCoder-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Deci/DeciCoder-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Deci/DeciCoder-1b](https://huggingface.co/Deci/DeciCoder-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Deci__DeciCoder-1b_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T12:47:40.264080](https://huggingface.co/datasets/open-llm-leaderboard/details_Deci__DeciCoder-1b_public/blob/main/results_2023-11-08T12-47-40.264080.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723942,
"f1": 0.02978817114093966,
"f1_stderr": 0.0009513874747103622,
"acc": 0.26286237271664875,
"acc_stderr": 0.00882802109541121
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723942,
"f1": 0.02978817114093966,
"f1_stderr": 0.0009513874747103622
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.003605486867998233
},
"harness|winogrande|5": {
"acc": 0.5082872928176796,
"acc_stderr": 0.014050555322824189
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
alisson40889/globo | ---
license: openrail
---
|
akoukas/autexDetectionEN | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': human
'1': generated
splits:
- name: train
num_bytes: 10758176
num_examples: 33845
- name: test
num_bytes: 7874225
num_examples: 21832
download_size: 11485680
dataset_size: 18632401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
seyyedaliayati/solidity-dataset | ---
dataset_info:
features:
- name: hash
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: is_test
dtype: bool
- name: repo_id
dtype: string
- name: repo_name
dtype: string
- name: repo_head
dtype: string
- name: repo_path
dtype: string
- name: content_tokens
dtype: int64
- name: content_chars
dtype: int64
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5736925269
num_examples: 284112
- name: test
num_bytes: 710770657
num_examples: 35514
- name: eval
num_bytes: 721961344
num_examples: 35514
download_size: 2050339485
dataset_size: 7169657270
license: cc
task_categories:
- text-generation
- text2text-generation
- text-classification
language:
- en
tags:
- solidity
- test case
- smart contract
- ethereum
pretty_name: Solidity Dataset
size_categories:
- 100K<n<1M
---
# Solidity Dataset
## Dataset Description
This dataset is collected from public GitHub repositories written in Solidity programming language.
The list of the repositories is available at [repositories.json](https://huggingface.co/datasets/seyyedaliayati/solidity-dataset/blob/main/repositories.json) file.
It contains useful data about smart contracts written in Solidity, along with the test cases (and unit tests) written for those contracts.
## Dataset Summary
The dataset contains [355,540 rows](#data-splits) in total. Each row includes the following features:
- `hash` (string): The sha256 hash value of the file content before any pre-processing.
- `size` (integer): File size in bytes.
- `ext` (string): File extension.
- `lang` (string): The name of the programming language the file is written in (Solidity, Python, or JavaScript).
- `is_test` (bool): Indicates whether this file is a test case (test file) or the main smart contract code.
- `repo_id` (string): GitHub repository identifier fetched from GitHub's API.
- `repo_name` (string): GitHub's repository name.
- `repo_head` (string): The head commit of the repository from which the file was fetched.
- `repo_path` (string): Relative file path.
- `content_tokens` (integer): Number of tokens in the file content.
- `content_chars` (integer): Number of characters in the file content.
- `content` (string): File content.
- `__index_level_0__` (integer): Ignore this field please!
## Supported Tasks and Leaderboards
This dataset can be used for tasks related to analyzing smart contracts, analyzing the test cases written for smart contracts, and improving language models on the Solidity language.
As of now, there are no specific leaderboards associated with this dataset.
## Languages
- The dataset is in the English language (en).
- Smart contracts (`is_test=false`) are in Solidity programming language.
- Test cases (`is_test=true`) are in Solidity, Python, or JavaScript programming language.
## Data Splits
The dataset is split into three splits:
- `train`: 284112 rows (80% of the dataset)
- `test`: 35514 rows (10% of the dataset)
- `eval`: 35514 rows (10% of the dataset)
## Dataset Creation
The `content_tokens` field is generated via the [StarCoderBase tokenizer](https://huggingface.co/bigcode/starcoderbase) using the following code snippet:
```python
from transformers import AutoTokenizer

# Tokenizer used to compute the `content_tokens` field
checkpoint = "bigcode/starcoderbase"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

def count_tokens(code: str) -> int:
    """Count StarCoderBase tokens in a code string."""
    tokens = tokenizer.tokenize(code)
    return len(tokens)
```
The `is_test` flag is calculated by detecting certain regex patterns in the file content. More details will be published soon.
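The exact patterns have not been published, so the following is only a hypothetical sketch of how such a regex-based heuristic might look; the patterns shown are assumptions for illustration, not the ones actually used to build this dataset:

```python
import re

# Hypothetical patterns that often indicate a test file; NOT the actual
# (unpublished) patterns used to label this dataset.
_TEST_PATTERNS = [
    re.compile(r"\bdescribe\s*\(", re.IGNORECASE),            # JS/Mocha test suites
    re.compile(r"\bit\s*\(\s*[\"']"),                         # JS/Mocha test cases
    re.compile(r"\bdef\s+test_\w+"),                          # Python/pytest functions
    re.compile(r"\bfunction\s+test\w*\s*\(", re.IGNORECASE),  # Solidity/Foundry tests
]

def looks_like_test(content: str) -> bool:
    """Return True if the file content matches any test-like pattern."""
    return any(p.search(content) for p in _TEST_PATTERNS)

print(looks_like_test("function testTransfer() public { }"))  # True
print(looks_like_test("contract Token { uint256 supply; }"))  # False
```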
## License
This dataset is released under the [Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license](https://creativecommons.org/licenses/by-nc/4.0/).
## Citation
Please use the following citation when referencing this dataset:
```
@misc {seyyed_ali_ayati_2023,
author = { {Seyyed Ali Ayati} },
title = { solidity-dataset (Revision 77e80ad) },
year = 2023,
url = { https://huggingface.co/datasets/seyyedaliayati/solidity-dataset },
doi = { 10.57967/hf/0808 },
publisher = { Hugging Face }
}
``` |
omar47/dummy_en_asr | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 13315953.0
num_examples: 60
- name: validation
num_bytes: 3749618.0
num_examples: 40
- name: test
num_bytes: 5333789.0
num_examples: 40
download_size: 21477003
dataset_size: 22399360.0
---
# Dataset Card for "dummy_en_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nithin1995/dfc_sroie_caption_subset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 505493.0
num_examples: 5
download_size: 471183
dataset_size: 505493.0
---
# Dataset Card for "dfc_sroie_caption_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MouezYazidi/ChatGPT_tweets | ---
license: apache-2.0
task_categories:
- text-classification
- summarization
- feature-extraction
size_categories:
- 10K<n<100K
---
Dataset sourced from Twitter, featuring 30,000 rows of multilingual user feedback tweets about ChatGPT. Each row contains text feedback, reflecting diverse user experiences. This dataset, hosted on Hugging Face, provides valuable resources for language analysis and understanding user interactions across different languages. Potential use cases include language modeling, multilingual sentiment analysis, user behavior analysis, and training of machine learning models for natural language processing tasks. |
skrishna/coin_flip | ---
license: mit
---
|
anan-2024/twitter_dataset_1713044296 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22017
num_examples: 50
download_size: 12149
dataset_size: 22017
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
librarian-bots/arxiv-metadata-snapshot | ---
language:
- en
license: cc0-1.0
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- text-classification
pretty_name: arXiv Metadata Dataset
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: submitter
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: comments
dtype: string
- name: journal-ref
dtype: string
- name: doi
dtype: string
- name: report-no
dtype: string
- name: categories
dtype: string
- name: license
dtype: string
- name: abstract
dtype: string
- name: versions
list:
- name: version
dtype: string
- name: created
dtype: string
- name: update_date
dtype: timestamp[s]
- name: authors_parsed
sequence:
sequence: string
splits:
- name: train
num_bytes: 3697861871.0
num_examples: 2459562
download_size: 2070637790
dataset_size: 3697861871.0
tags:
- arxiv
- science
---
# Dataset Card for "arxiv-metadata-oai-snapshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
This is a mirror of the metadata portion of the arXiv [dataset](https://www.kaggle.com/datasets/Cornell-University/arxiv/versions/147).
The sync takes place weekly, so this mirror may fall slightly behind the source dataset if the original is updated more frequently.
## Metadata
This dataset is a mirror of the original ArXiv data. This dataset contains an entry for each paper, containing:
- id: ArXiv ID (can be used to access the paper, see below)
- submitter: Who submitted the paper
- authors: Authors of the paper
- title: Title of the paper
- comments: Additional info, such as number of pages and figures
- journal-ref: Information about the journal the paper was published in
- doi: [Digital Object Identifier](https://www.doi.org)
- abstract: The abstract of the paper
- categories: Categories / tags in the ArXiv system
- versions: A version history
You can access each paper directly on ArXiv using these links:
- `https://arxiv.org/abs/{id}`: Page for this paper including its abstract and further links
- `https://arxiv.org/pdf/{id}`: Direct link to download the PDF
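As a rough illustration, both links can be derived from an entry's `id` field with plain string formatting (the `arxiv_links` helper below is just a sketch, not part of the dataset):

```python
def arxiv_links(paper_id: str) -> dict:
    """Build the abstract-page and PDF URLs for an arXiv ID."""
    return {
        "abs": f"https://arxiv.org/abs/{paper_id}",
        "pdf": f"https://arxiv.org/pdf/{paper_id}",
    }

links = arxiv_links("1706.03762")
print(links["abs"])  # https://arxiv.org/abs/1706.03762
print(links["pdf"])  # https://arxiv.org/pdf/1706.03762
```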
|
Nexdata/Filipino_Conversational_Speech_Data_by_Mobile_Phone | ---
language:
- tl
task_categories:
- conversational
---
# Dataset Card for Nexdata/Filipino_Conversational_Speech_Data_by_Mobile_Phone
## Description
The 104-hour Filipino Conversational Speech Data by Mobile Phone corpus involved 140 native speakers, recruited with a proper balance of gender ratio. Speakers chose a few familiar topics from a given list and held conversations to ensure the dialogues' fluency and naturalness. The recording devices were various mobile phones. The audio format is 16 kHz, 16-bit, uncompressed WAV, and all speech was recorded in quiet indoor environments. All audio was manually transcribed with the text content, the start and end times of each effective sentence, and speaker identification.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1238?source=Huggingface
# Specifications
## Format
16kHz 16bit, uncompressed wav, mono channel;
## Environment
quiet indoor environment, without echo;
## Recording content
dozens of topics are specified, and the speakers hold dialogues on those topics while the recording is performed;
## Demographics
140 speakers totally, with 52% male and 48% female;
## Annotation
annotated for transcription text, speaker identification, and gender;
## Device
Android mobile phone, iPhone;
## Language
Filipino;
## Application scenarios
speech recognition; voiceprint recognition;
## Accuracy rate
the word accuracy rate is not less than 98%
# Licensing Information
Commercial License |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_A_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 864450
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 864450
num_examples: 1000
- name: fewshot_0
num_bytes: 893658
num_examples: 1000
download_size: 428473
dataset_size: 2622558
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_A_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |