| datasetId | card |
|---|---|
jorgeortizfuentes/spanish_attitude_lsf | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: tokens
sequence: string
- name: attitude_tags
sequence:
class_label:
names:
'0': O
'1': judgment
'2': appreciation
'3': affect
- name: types_jugdment_tags
sequence:
class_label:
names:
'0': O
'1': social esteem
'2': social sanction
- name: subtypes_jugdment_tags
sequence:
class_label:
names:
'0': O
'1': tenacity
'2': propriety
'3': normality
'4': veracity
'5': capacity
splits:
- name: train
num_bytes: 3057071
num_examples: 1782
- name: validation
num_bytes: 649234
num_examples: 382
- name: test
num_bytes: 630438
num_examples: 382
download_size: 982636
dataset_size: 4336743
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
wza/finccf | ---
license: apache-2.0
---
|
AdapterOcean/math_dataset_standardized_cluster_4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 47907460
num_examples: 5005
download_size: 13098592
dataset_size: 47907460
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_dataset_standardized_cluster_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PlenitudeAI/simpsons_prompt_lines | ---
dataset_info:
features:
- name: previous
dtype: string
- name: character
dtype: string
- name: line
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 191013022
num_examples: 121841
download_size: 0
dataset_size: 191013022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "simpsons_prompt_lines"

I used the [Simpsons](https://www.kaggle.com/datasets/prashant111/the-simpsons-dataset?resource=download&select=simpsons_episodes.csv) Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv).
I got the idea and part of the code from this [blog post](https://replicate.com/blog/fine-tune-llama-to-speak-like-homer-simpson) from Replicate.
This dataset can be used to fine-tune a chat LLM to speak like one of the characters of the show!
### Example
```json
{
"previous": "Marge Simpson: Homer, get up! Up, up, up!\nMarge Simpson: Oh no!\nHomer Simpson: Whuzzit... My juice box!\nMarge Simpson: Sorry, Homie, but you promised to take me to the Apron Expo today.",
"character": "Homer Simpson",
"line": "Just give me ten more hours.",
"text": "<s> [INST] Below is a script from the American animated sitcom The Simpsons. Write a response that completes Homer Simpson's last line in the conversation. \n\nMarge Simpson: Homer, get up! Up, up, up!\nMarge Simpson: Oh no!\nHomer Simpson: Whuzzit... My juice box!\nMarge Simpson: Sorry, Homie, but you promised to take me to the Apron Expo today.\nHomer Simpson: [/INST] Just give me ten more hours. </s>"
}
```
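The `text` field appears to follow the Llama-2 `[INST]` chat template. As a sketch inferred from the single example above (the exact template is an assumption, not documented by the card), the field can be reconstructed from the other three columns like this:

```python
def build_prompt(previous: str, character: str, line: str) -> str:
    """Assemble the instruction-formatted training text, mirroring the
    "text" field in the example above (Llama-2 [INST] chat format)."""
    instruction = (
        "Below is a script from the American animated sitcom The Simpsons. "
        f"Write a response that completes {character}'s last line in the conversation. "
    )
    # The target line goes after [/INST] so the model learns to produce it.
    return f"<s> [INST] {instruction}\n\n{previous}\n{character}: [/INST] {line} </s>"


text = build_prompt(
    previous="Marge Simpson: Homer, get up! Up, up, up!",
    character="Homer Simpson",
    line="Just give me ten more hours.",
)
print(text)
```

This makes it easy to rebuild prompts with a different instruction or chat template if you fine-tune a model other than Llama 2.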
### Characters
- Homer Simpson
- Bart Simpson
- Marge Simpson
- Lisa Simpson
- C. Montgomery Burns
- Seymour Skinner
- Moe Szyslak
- Ned Flanders
- Grampa Simpson
- Krusty the Clown
- Chief Wiggum
- Milhouse Van Houten
- Waylon Smithers
- Apu Nahasapeemapetilon
- Kent Brockman
- Nelson Muntz
- Barney Gumble
- Lenny Leonard
- Edna Krabappel-Flanders
- Sideshow Bob
- Dr. Julius Hibbert
- Selma Bouvier
- Ralph Wiggum
- Rev. Timothy Lovejoy
- Crowd
- Carl Carlson
- Patty Bouvier
- Mayor Joe Quimby
- Otto Mann
- Groundskeeper Willie
- Martin Prince
- Announcer
- Comic Book Guy
- Kids
- Lionel Hutz
- HERB
- Sideshow Mel
- Gary Chalmers
- Professor Jonathan Frink
- Jimbo Jones
- Lou
- Todd Flanders
- Miss Hoover
- Agnes Skinner
- Maude Flanders
- Troy McClure
- Fat Tony
- Snake Jailbird
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheFinAI/flare-finarg-ecc-auc | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 549300
num_examples: 969
download_size: 177802
dataset_size: 549300
---
# Dataset Card for "flare-finarg-ecc-auc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nblinh63/twitter_dataset_1712687979 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 78974
num_examples: 200
download_size: 37277
dataset_size: 78974
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
notrichardren/azaria-mitchell-diff-filtered | ---
configs:
- config_name: default
data_files:
- split: cities
path: data/cities-*
- split: companies
path: data/companies-*
- split: animals
path: data/animals-*
- split: elements
path: data/elements-*
- split: inventions
path: data/inventions-*
- split: facts
path: data/facts-*
dataset_info:
features:
- name: claim
dtype: string
- name: label
dtype: int64
- name: dataset
dtype: string
- name: qa_type
dtype: int64
- name: ind
dtype: int64
splits:
- name: cities
num_bytes: 7955
num_examples: 112
- name: companies
num_bytes: 14588
num_examples: 129
- name: animals
num_bytes: 11451
num_examples: 137
- name: elements
num_bytes: 11617
num_examples: 139
- name: inventions
num_bytes: 10559
num_examples: 127
- name: facts
num_bytes: 14809
num_examples: 159
download_size: 44699
dataset_size: 70979
---
# Dataset Card for "azaria-mitchell-diff-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_244 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 940033572
num_examples: 183171
download_size: 960820801
dataset_size: 940033572
---
# Dataset Card for "chunk_244"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
somosnlp/LingComp_QA | ---
license: cc-by-nc-sa-4.0
task_categories:
- question-answering
language:
- es
tags:
- computational linguistics
- spanish
- NLP
- json
size_categories:
- 1K<n<10K
---
# Dataset Card for LingComp_QA
<!-- Provide a quick summary of the dataset. -->
This dataset was created from internet blogs and openly available pages on computational linguistics. It covers some topics in statistics, linguistics, and computer science
(especially questions related to Python, the main programming language for natural language processing).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This dataset is intended to be educational, explaining concepts and answering questions about the evolution of the discipline and about the basic functions and features
of Python and packages such as WordNet or NLTK. It also answers some questions on corpus linguistics and, therefore, statistics, especially questions
asking for explanations of concepts. It currently contains 1,004 question-answer pairs.
- **Collected by:**
(Universidad de Cádiz, Instituto de Lingüística Aplicada)
- [Jorge Zamora Rey](https://huggingface.co/reddrex)
- [Isabel Moyano Moreno](https://huggingface.co/issyinthesky)
- [Mario Crespo Miguel](https://huggingface.co/MCMiguel)
- **Language(s) (NLP):** Spanish
- **License:** Creative Commons Attribution Non Commercial Share Alike 4.0
### Dataset Format
The dataset has the following form:
```json
[
{
"pregunta": "¿Qué implica la lingüística computacional teórica?",
"respuesta": "La lingüística computacional teórica incluye el desarrollo de teorías formales de gramática y semántica, basadas en lógicas formales o enfoques simbólicos. Las áreas de estudio teórico en este ámbito incluyen la complejidad computacional y la semántica computacional."
},
{
"pregunta": "¿Qué es una gramática libre de contexto?",
"respuesta": "Una gramática libre de contexto es una gramática formal en la que cada regla de producción es de la forma V → w, donde V es un símbolo no terminal y w es una cadena de terminales y/o no terminales."
},
{
"pregunta": "¿Qué es el algoritmo CYK y cuál es su propósito?",
"respuesta": "El algoritmo de Cocke-Younger-Kasami (CYK) es un algoritmo de análisis sintáctico ascendente que determina si una cadena puede ser generada por una gramática libre de contexto y, en caso afirmativo, cómo puede ser generada. Su propósito es realizar un análisis sintáctico de la cadena para determinar su estructura gramatical."
},
{...}
]
```
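For readers who want to consume the file programmatically, a minimal sketch using the standard library (assuming the dataset is a single JSON array of objects, as in the example above; the field names are Spanish: "pregunta" = question, "respuesta" = answer):

```python
import json

# A small inline sample in the same shape as the card's example.
sample = '''[
  {"pregunta": "¿Qué implica la lingüística computacional teórica?",
   "respuesta": "Incluye el desarrollo de teorías formales de gramática y semántica."}
]'''

# Parse the array and iterate over the question-answer pairs.
pairs = json.loads(sample)
for pair in pairs:
    print(f"Q: {pair['pregunta']}\nA: {pair['respuesta']}")
```

To load the real file, replace `json.loads(sample)` with `json.load(open("lingcomp_QA.json", encoding="utf-8"))` (the actual filename in the repository may differ).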
### Dataset Uses
Our main intention going forward is to expand the dataset with other sources of information and use it to build a conversational agent that helps students
of the Computational Linguistics and Language Engineering courses of the Linguistics and Applied Languages degree, as well as others interested in this field;
although our primary audience is linguists, anyone interested is welcome.
### Dataset Links
<!-- Provide the basic links for the dataset. -->
- **Repositorio:** https://github.com/reddrex/lingcomp_QA/tree/main
<!-- Paper [optional]:** [More Information Needed] -->
## Dataset Creation
For this dataset we used the lecture notes of the Computational Linguistics course of the Linguistics and Applied Languages degree at the University of Cádiz, together with some
material from the Language Engineering course. We also added information from Spanish-language web pages, especially blogs, found through BootCaT
searches for computational linguistics terms. These terms were chosen from the main topics covered in reference books such as Jurafsky and Martin's Speech and Language Processing.
Tools: BootCaT for extracting .txt files from the web; Sublime Text for organizing the question-answer pairs into JSON and for regex-based cleaning.
### Information Sources
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
The dataset consists of questions and answers formulated from the lecture notes of the courses mentioned above, from Wikipedia articles, and from blogs
that discuss computational linguistics terms or explain how to use certain packages or how to program in Python.
The collected information mostly relates to the following concepts:
- Algorithms and formalisms
- Programming languages
- CPU/GPU
- Environments such as Colab or Jupyter
- Python: data types, built-in functions, methods, object-oriented programming, list comprehensions, etc.
- NLTK
- SpaCy
- History and evolution of NLP
- NLP/computational linguistics (computational syntax and semantics, differences, concepts...)
- Linguistics
- Resources such as FrameNet, WordNet, treebanks, the Brown Corpus, and ontologies
- Corpus linguistics: concordances, collocations, statistical topics (chi-squared, log-likelihood, data, sampling...)
## Biases, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Time constraints made it especially difficult to gather as much information as exists in a field like computational linguistics, so
we limited ourselves to covering certain basic topics in this area. Another problem is the scarcity of openly available information for building a corpus,
since most of the information we found on these topics belonged to scientific articles.
<!-- ## Citation -->
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> |
dim/horoscopes_ru_1k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prediction
dtype: string
splits:
- name: train
num_bytes: 952167
num_examples: 1000
download_size: 462523
dataset_size: 952167
---
# Dataset Card for "horoscopes_ru_1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
s3prl/SNIPS | ---
license: mit
---
|
Goorm-AI-04/DroneRF | ---
license: unknown
---
|
KYKNZ/SDXL | ---
license: cc0-1.0
---
|
liuyanchen1015/MULTI_VALUE_wnli_aint_be | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1372
num_examples: 7
- name: train
num_bytes: 11692
num_examples: 63
download_size: 10541
dataset_size: 13064
---
# Dataset Card for "MULTI_VALUE_wnli_aint_be"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zm87/dushu | ---
license: openrail
---
|
andrewlee1807/Gyeonggi | ---
license: apache-2.0
task_categories:
- time-series-forecasting
language:
- en
tags:
- electricity
size_categories:
- 100M<n<1B
---
## Dataset Description
<table border="0">
<tr>
<td style="width: 40%; vertical-align: top">
The Gyeonggi dataset covers 10,000 households, selected based on the highest meter reading rates across all branches in Gyeonggi Province, South Korea. For privacy reasons, household names are not provided; we only provide the ID of each household.
</td>
<td>
<img src="imgs/gy-map.png" >
</td>
</tr>
</table>
### Dataset Summary
This dataset encompasses hourly records of building power consumption spanning approximately 1.9 years, from January 1, 2021, to January 14, 2022.
| electrical-meter-id | date | hour | customer-id | amount-of-consumption |
|---------------------|----------|------|-------------|-----------------------|
| 7871 | 20201020 | 1 | 7871 | 4.25 |
| 7871 | 20201020 | 2 | 7871 | 4.12 |
| 7871 | 20201020 | 3 | 7871 | 4.08 |
| 7871 | 20201020 | 4 | 7871 | 4.03 |
| 7871 | 20201020 | 5 | 7871 | 4.09 |
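The rows above can be turned into a time-indexed series for forecasting. This is a hypothetical sketch, assuming the schema shown in the table (column names taken from the header, `date` as `YYYYMMDD` integers, `hour` starting at 1):

```python
import pandas as pd

# Toy records in the shape of the table above (not real dataset values
# beyond the five sample rows shown).
records = pd.DataFrame(
    {
        "electrical-meter-id": [7871] * 5,
        "date": [20201020] * 5,
        "hour": [1, 2, 3, 4, 5],
        "customer-id": [7871] * 5,
        "amount-of-consumption": [4.25, 4.12, 4.08, 4.03, 4.09],
    }
)

# Build a proper timestamp (hour N = Nth hour of the day) and index by it,
# the usual first step for time-series forecasting on hourly meter data.
records["timestamp"] = pd.to_datetime(
    records["date"], format="%Y%m%d"
) + pd.to_timedelta(records["hour"] - 1, unit="h")
series = records.set_index("timestamp")["amount-of-consumption"]

# Aggregate hourly readings into daily consumption totals.
daily = series.resample("D").sum()
print(daily)
```

From here, standard resampling and windowing operations (daily totals, rolling means, lag features) apply directly.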
#### Our experiment focuses on the total electricity consumption of one particular household, ID 6499
|
mask-distilled-one-sec-cv12/chunk_257 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1098252744
num_examples: 215682
download_size: 1118673550
dataset_size: 1098252744
---
# Dataset Card for "chunk_257"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seongill/squad_conflict_v2_under_150 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: masked_query
dtype: string
- name: query_embedding
sequence: float64
- name: answer_sentence
dtype: string
- name: is_named_entity
dtype: bool
- name: ent_type
dtype: string
- name: num_words
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 192599442
num_examples: 26799
download_size: 142312600
dataset_size: 192599442
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
abhika-m/fava-flagged-demo | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
alvations/c4p0-x1-en-engb | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
splits:
- name: train
num_bytes: 5583
num_examples: 5
download_size: 17399
dataset_size: 5583
configs:
- config_name: default
data_files:
- split: train
path: 5eeb99e4b632b370/train-*
---
|
cleexiang/chat_unsensored | ---
license: apache-2.0
task_categories:
- question-answering
language:
- ar
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-30b | ---
pretty_name: Evaluation run of ibivibiv/aegolius-acadicus-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ibivibiv/aegolius-acadicus-30b](https://huggingface.co/ibivibiv/aegolius-acadicus-30b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T08:41:28.082474](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-30b/blob/main/results_2024-01-25T08-41-28.082474.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6566791267920726,\n\
\ \"acc_stderr\": 0.03204461446226675,\n \"acc_norm\": 0.6559064592526,\n\
\ \"acc_norm_stderr\": 0.032719772118023696,\n \"mc1\": 0.5177478580171359,\n\
\ \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6707176642401714,\n\
\ \"mc2_stderr\": 0.015136561645539275\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n\
\ \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7103166699860586,\n\
\ \"acc_stderr\": 0.004526883021027629,\n \"acc_norm\": 0.880103565026887,\n\
\ \"acc_norm_stderr\": 0.0032417650929121374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723302,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n\
\ \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.016568971233548606,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.016568971233548606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863937,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863937\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5177478580171359,\n\
\ \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6707176642401714,\n\
\ \"mc2_stderr\": 0.015136561645539275\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479672\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954772\n }\n}\n```"
repo_url: https://huggingface.co/ibivibiv/aegolius-acadicus-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|arc:challenge|25_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|arc:challenge|25_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|gsm8k|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|gsm8k|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hellaswag|10_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hellaswag|10_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T08-40-13.766236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T08-41-28.082474.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T08-41-28.082474.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- '**/details_harness|winogrande|5_2024-01-25T08-40-13.766236.parquet'
- split: 2024_01_25T08_41_28.082474
path:
- '**/details_harness|winogrande|5_2024-01-25T08-41-28.082474.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T08-41-28.082474.parquet'
- config_name: results
data_files:
- split: 2024_01_25T08_40_13.766236
path:
- results_2024-01-25T08-40-13.766236.parquet
- split: 2024_01_25T08_41_28.082474
path:
- results_2024-01-25T08-41-28.082474.parquet
- split: latest
path:
- results_2024-01-25T08-41-28.082474.parquet
---
# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-30b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibivibiv/aegolius-acadicus-30b](https://huggingface.co/ibivibiv/aegolius-acadicus-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-30b",
"harness_winogrande_5",
split="train")
```
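Since each run appears as a split named after its timestamp, the most recent run can be picked programmatically by sorting the split names. A minimal sketch, assuming the names follow the zero-padded `YYYY_MM_DDTHH_MM_SS.ffffff` pattern used in this dataset (in practice you can also just use the `"latest"` split, which already points to the most recent run):

```python
# Split names encode the run timestamp, e.g. "2024_01_25T08_41_28.082474".
# Lexicographic comparison works here because every field is zero-padded
# and ordered from most to least significant (year, month, day, ...).
timestamp_splits = [
    "2024_01_25T08_40_13.766236",
    "2024_01_25T08_41_28.082474",
]

latest_split = max(timestamp_splits)
print(latest_split)  # 2024_01_25T08_41_28.082474
```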
## Latest results
These are the [latest results from run 2024-01-25T08:41:28.082474](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-30b/blob/main/results_2024-01-25T08-41-28.082474.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6566791267920726,
"acc_stderr": 0.03204461446226675,
"acc_norm": 0.6559064592526,
"acc_norm_stderr": 0.032719772118023696,
"mc1": 0.5177478580171359,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6707176642401714,
"mc2_stderr": 0.015136561645539275
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989506
},
"harness|hellaswag|10": {
"acc": 0.7103166699860586,
"acc_stderr": 0.004526883021027629,
"acc_norm": 0.880103565026887,
"acc_norm_stderr": 0.0032417650929121374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723302,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461783,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.016568971233548606,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.016568971233548606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863937,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863937
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5177478580171359,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6707176642401714,
"mc2_stderr": 0.015136561645539275
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479672
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954772
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maghwa/OpenHermes-2-AR-10K-47-910k-920k | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: 'null'
- name: topic
dtype: 'null'
- name: hash
dtype: 'null'
- name: model
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
dtype: string
- name: model_name
dtype: 'null'
- name: source
dtype: string
- name: skip_prompt_formatting
dtype: 'null'
- name: language
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: category
dtype: 'null'
- name: views
dtype: float64
splits:
- name: train
num_bytes: 24571460
num_examples: 10001
download_size: 9747541
dataset_size: 24571460
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
flax-sentence-embeddings/stackexchange_titlebody_best_and_down_voted_answer_jsonl | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
pretty_name: stackexchange
size_categories:
- unknown
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- closed-domain-qa
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
  - [Who are the source language producers?](#who-are-the-source-language-producers)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [stackexchange](https://archive.org/details/stackexchange)
- **Repository:** [flax-sentence-embeddings](https://github.com/nreimers/flax-sentence-embeddings)
### Dataset Summary
We automatically extracted question and answer (Q&A) pairs from the [Stack Exchange](https://stackexchange.com/) network. Stack Exchange gathers many Q&A communities across 50 online platforms, including the well-known Stack Overflow and other technical sites. 100 million developers consult Stack Exchange every month. The dataset is a parallel corpus with each question mapped to the top-rated answer. The dataset is split by community, covering a variety of domains from 3D printing, economics, and Raspberry Pi to Emacs. An exhaustive list of all communities is available [here](https://stackexchange.com/sites).
### Languages
Stack Exchange content is mainly in English (en).
## Dataset Structure
### Data Instances
Each data sample is presented as follows:
```
{'title_body': "Is there a Stack Exchange icon available? StackAuth /sites route provides all the site's icons except for the one of the Stack Exchange master site.\nCould you please provide it in some way (a static SVG would be good)?",
'upvoted_answer': 'Here it is!\n\nDead link: SVG version here\nNote: the same restrictions on this trademarked icon that apply here, also apply to the icon above.',
'downvoted_answer': 'No, the /sites route is not the right place for that.\n\n/sites enumerates all websites that expose API end-points. StackExchange.com does not expose such an endpoint, so it does not (and will not) appear in the results.'}
```
This particular example corresponds to the [following page](https://stackapps.com/questions/1508/is-there-a-stack-exchange-icon-available)
### Data Fields
The fields in the dataset contain the following information:
- `title_body`: This is the concatenation of the title and body from the question
- `upvoted_answer`: This is the body from the most upvoted answer
- `downvoted_answer`: This is the body from the most downvoted answer
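For contrastive training, each record can be turned into an (anchor, positive, negative) triplet: the question is the anchor, the most upvoted answer the positive, and the most downvoted answer a hard negative. A minimal sketch (the field names match the dataset; the helper name is ours):

```python
def to_triplet(sample: dict) -> tuple[str, str, str]:
    """Turn one dataset record into an (anchor, positive, negative) triplet.

    The question (`title_body`) is the anchor, the most upvoted answer is
    the positive, and the most downvoted answer is the hard negative.
    """
    return (
        sample["title_body"],
        sample["upvoted_answer"],
        sample["downvoted_answer"],
    )

# Example record with the three fields described above (bodies abridged).
sample = {
    "title_body": "Is there a Stack Exchange icon available? ...",
    "upvoted_answer": "Here it is! ...",
    "downvoted_answer": "No, the /sites route is not the right place for that. ...",
}
anchor, positive, negative = to_triplet(sample)
```

Triplets in this form can be fed directly to standard triplet or multiple-negatives ranking losses.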
### Data Splits
We provide multiple splits for this dataset, each of which refers to a given community channel. We detail the number of pairs for each split below:
| | Number of pairs |
| ----- | ------ |
| english | 13,003 |
| academia | 2,465 |
| christianity | 1,502 |
| apple | 6,696 |
| electronics | 4,014 |
| gaming | 7,321 |
| askubuntu | 9,975 |
| ell | 4,438 |
| hermeneutics | 1,719 |
| judaism | 2,216 |
| diy | 2,037 |
| law | 1,297 |
| history | 1,099 |
| islam | 2,037 |
| dba | 2,502 |
| cooking | 2,064 |
| gamedev | 1,598 |
| drupal | 1,714 |
| chemistry | 1,523 |
| android | 2,830 |
| mathoverflow | 1,109 |
| magento | 1,849 |
| buddhism | 770 |
| gis | 1,843 |
| graphicdesign | 1,565 |
| codereview | 666 |
| aviation | 903 |
| bicycles | 984 |
| japanese | 1,124 |
| cs | 936 |
| german | 1,047 |
| interpersonal | 469 |
| biology | 832 |
| bitcoin | 1,068 |
| blender | 1,312 |
| crypto | 595 |
| anime | 802 |
| boardgames | 691 |
| hinduism | 343 |
| french | 632 |
| fitness | 567 |
| economics | 441 |
| chinese | 611 |
| codegolf | 333 |
| linguistics | 442 |
| astronomy | 371 |
| arduino | 595 |
| chess | 402 |
| cstheory | 314 |
| ja | 328 |
| martialarts | 254 |
| mathematica | 262 |
| dsp | 387 |
| ethereum | 479 |
| health | 299 |
| cogsci | 221 |
| earthscience | 229 |
| gardening | 210 |
| datascience | 325 |
| literature | 191 |
| matheducators | 177 |
| lifehacks | 316 |
| engineering | 227 |
| ham | 158 |
| 3dprinting | 109 |
| italian | 181 |
| emacs | 188 |
| homebrew | 176 |
| ai | 130 |
| avp | 152 |
| expatriates | 132 |
| elementaryos | 224 |
| cseducators | 67 |
| hsm | 70 |
| expressionengine | 91 |
| joomla | 124 |
| freelancing | 70 |
| crafts | 72 |
| genealogy | 86 |
| latin | 55 |
| hardwarerecs | 58 |
| devops | 53 |
| coffee | 47 |
| beer | 57 |
| languagelearning | 42 |
| ebooks | 54 |
| bricks | 79 |
| civicrm | 85 |
| bioinformatics | 39 |
| esperanto | 56 |
| computergraphics | 30 |
| conlang | 8 |
| korean | 28 |
| iota | 31 |
| eosio | 44 |
| craftcms | 26 |
| iot | 10 |
| drones | 6 |
| cardano | 7 |
| materials | 1 |
| ru | 6,305 |
| softwareengineering | 4,238 |
| scifi | 5,176 |
| workplace | 4,317 |
| serverfault | 7,969 |
| rpg | 4,212 |
| physics | 8,362 |
| superuser | 17,425 |
| worldbuilding | 2,087 |
| security | 3,069 |
| pt | 3,718 |
| unix | 6,173 |
| meta | 61 |
| politics | 1,468 |
| stats | 2,238 |
| movies | 1,577 |
| photo | 1,432 |
| wordpress | 3,046 |
| music | 1,228 |
| philosophy | 1,184 |
| skeptics | 670 |
| money | 1,905 |
| salesforce | 1,781 |
| parenting | 624 |
| raspberrypi | 1,011 |
| travel | 1,317 |
| mechanics | 842 |
| tex | 1,095 |
| ux | 1,107 |
| sharepoint | 1,691 |
| webapps | 1,906 |
| puzzling | 784 |
| networkengineering | 476 |
| webmasters | 854 |
| sports | 455 |
| rus | 514 |
| space | 405 |
| writers | 407 |
| pets | 322 |
| pm | 241 |
| russian | 353 |
| spanish | 366 |
| sound | 365 |
| quant | 340 |
| sqa | 353 |
| outdoors | 221 |
| softwarerecs | 348 |
| retrocomputing | 135 |
| mythology | 103 |
| portuguese | 144 |
| opensource | 123 |
| scicomp | 127 |
| ukrainian | 87 |
| patents | 137 |
| sustainability | 152 |
| poker | 115 |
| robotics | 110 |
| woodworking | 93 |
| reverseengineering | 97 |
| sitecore | 122 |
| tor | 137 |
| vi | 95 |
| windowsphone | 153 |
| vegetarianism | 35 |
| moderators | 23 |
| quantumcomputing | 46 |
| musicfans | 78 |
| tridion | 68 |
| opendata | 45 |
| tezos | 11 |
| stellar | 3 |
| or | 13 |
| monero | 26 |
| stackapps | 15 |
| total | 210,748 |
## Dataset Creation
### Curation Rationale
We primarily designed this dataset for training sentence embeddings. Sentence embeddings may be trained in a contrastive learning setup, in which the model learns to associate each sentence with its corresponding pair out of multiple propositions. Such models require many examples to be effective, which makes dataset creation tedious. Community networks such as Stack Exchange allow us to build many examples semi-automatically.
### Source Data
The source data are dumps from [Stack Exchange](https://archive.org/details/stackexchange)
#### Initial Data Collection and Normalization
We collected the data from the math community.
We filtered out questions whose title or body length is below 20 characters, and questions whose body length is above 4096 characters.
When extracting the most upvoted answer, we kept only pairs with a gap of at least 100 votes between the most upvoted and most downvoted answers.
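The filtering rules above can be sketched as follows. The thresholds come from the text; the record layout and helper name are our assumptions, not the authors' actual pipeline:

```python
MIN_QUESTION_LEN = 20   # questions with a title or body shorter than this are dropped
MAX_BODY_LEN = 4096     # questions with a longer body are dropped
MIN_VOTE_GAP = 100      # required vote gap between best and worst answer

def keep_pair(title: str, body: str, votes_best: int, votes_worst: int) -> bool:
    """Return True if a question/answer pair passes the filters described above."""
    if len(title) < MIN_QUESTION_LEN or len(body) < MIN_QUESTION_LEN:
        return False
    if len(body) > MAX_BODY_LEN:
        return False
    return (votes_best - votes_worst) >= MIN_VOTE_GAP

# A long-enough question whose two answers are 150 votes apart passes:
ok = keep_pair("How do I reverse a list in Python?", "x" * 100, 120, -30)
```

Applying `keep_pair` per community before pairing questions with answers reproduces the per-split counts' general shape, though the exact numbers depend on the dump version.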
#### Who are the source language producers?
Questions and answers are written by the community of developers on Stack Exchange.
## Additional Information
### Licensing Information
Please see the license information at: https://archive.org/details/stackexchange
### Citation Information
```
@misc{StackExchangeDataset,
author = {Flax Sentence Embeddings Team},
title = {Stack Exchange question pairs},
year = {2021},
howpublished = {https://huggingface.co/datasets/flax-sentence-embeddings/},
}
```
### Contributions
Thanks to the Flax Sentence Embeddings team for adding this dataset. |
mycode-lucky321/Ruhan_mini-platypus-one | ---
license: mit
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 31386060
num_examples: 24895
download_size: 15599439
dataset_size: 31386060
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
erhwenkuo/ceval-exam-zhtw | ---
dataset_info:
- config_name: accountant
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 177004
num_examples: 443
- name: val
num_bytes: 19555
num_examples: 49
- name: dev
num_bytes: 3414
num_examples: 5
download_size: 151561
dataset_size: 199973
- config_name: advanced_mathematics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 50031
num_examples: 173
- name: val
num_bytes: 5331
num_examples: 19
- name: dev
num_bytes: 7021
num_examples: 5
download_size: 50945
dataset_size: 62383
- config_name: art_studies
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 41230
num_examples: 298
- name: val
num_bytes: 4581
num_examples: 33
- name: dev
num_bytes: 1439
num_examples: 5
download_size: 46573
dataset_size: 47250
- config_name: basic_medicine
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 28820
num_examples: 175
- name: val
num_bytes: 2627
num_examples: 19
- name: dev
num_bytes: 1825
num_examples: 5
download_size: 37502
dataset_size: 33272
- config_name: business_administration
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 78396
num_examples: 301
- name: val
num_bytes: 9225
num_examples: 33
- name: dev
num_bytes: 3155
num_examples: 5
download_size: 75404
dataset_size: 90776
- config_name: chinese_language_and_literature
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 32328
num_examples: 209
- name: val
num_bytes: 3446
num_examples: 23
- name: dev
num_bytes: 1892
num_examples: 5
download_size: 43537
dataset_size: 37666
- config_name: civil_servant
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 181519
num_examples: 429
- name: val
num_bytes: 21273
num_examples: 47
- name: dev
num_bytes: 4576
num_examples: 5
download_size: 180536
dataset_size: 207368
- config_name: clinical_medicine
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 42161
num_examples: 200
- name: val
num_bytes: 4167
num_examples: 22
- name: dev
num_bytes: 1951
num_examples: 5
download_size: 48783
dataset_size: 48279
- config_name: college_chemistry
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 45801
num_examples: 224
- name: val
num_bytes: 4443
num_examples: 24
- name: dev
num_bytes: 3611
num_examples: 5
download_size: 53682
dataset_size: 53855
- config_name: college_economics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 119746
num_examples: 497
- name: val
num_bytes: 14461
num_examples: 55
- name: dev
num_bytes: 3673
num_examples: 5
download_size: 106480
dataset_size: 137880
- config_name: college_physics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 55731
num_examples: 176
- name: val
num_bytes: 6145
num_examples: 19
- name: dev
num_bytes: 3824
num_examples: 5
download_size: 62806
dataset_size: 65700
- config_name: college_programming
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 84024
num_examples: 342
- name: val
num_bytes: 9615
num_examples: 37
- name: dev
num_bytes: 2900
num_examples: 5
download_size: 83274
dataset_size: 96539
- config_name: computer_architecture
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 41173
num_examples: 193
- name: val
num_bytes: 4188
num_examples: 21
- name: dev
num_bytes: 2841
num_examples: 5
download_size: 48203
dataset_size: 48202
- config_name: computer_network
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 35495
num_examples: 171
- name: val
num_bytes: 3814
num_examples: 19
- name: dev
num_bytes: 2364
num_examples: 5
download_size: 43988
dataset_size: 41673
- config_name: discrete_mathematics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 36057
num_examples: 153
- name: val
num_bytes: 3424
num_examples: 16
- name: dev
num_bytes: 2002
num_examples: 5
download_size: 43029
dataset_size: 41483
- config_name: education_science
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 55756
num_examples: 270
- name: val
num_bytes: 5522
num_examples: 29
- name: dev
num_bytes: 3093
num_examples: 5
download_size: 59946
dataset_size: 64371
- config_name: electrical_engineer
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 73769
num_examples: 339
- name: val
num_bytes: 8327
num_examples: 37
- name: dev
num_bytes: 2180
num_examples: 5
download_size: 74147
dataset_size: 84276
- config_name: environmental_impact_assessment_engineer
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 84701
num_examples: 281
- name: val
num_bytes: 9186
num_examples: 31
- name: dev
num_bytes: 2495
num_examples: 5
download_size: 73813
dataset_size: 96382
- config_name: fire_engineer
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 83743
num_examples: 282
- name: val
num_bytes: 10016
num_examples: 31
- name: dev
num_bytes: 2209
num_examples: 5
download_size: 82070
dataset_size: 95968
- config_name: high_school_biology
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 55242
num_examples: 175
- name: val
num_bytes: 6105
num_examples: 19
- name: dev
num_bytes: 2164
num_examples: 5
download_size: 60835
dataset_size: 63511
- config_name: high_school_chemistry
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 46918
num_examples: 172
- name: val
num_bytes: 5625
num_examples: 19
- name: dev
num_bytes: 2576
num_examples: 5
download_size: 55719
dataset_size: 55119
- config_name: high_school_chinese
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 110380
num_examples: 178
- name: val
num_bytes: 10475
num_examples: 19
- name: dev
num_bytes: 5290
num_examples: 5
download_size: 120269
dataset_size: 126145
- config_name: high_school_geography
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 41232
num_examples: 178
- name: val
num_bytes: 3985
num_examples: 19
- name: dev
num_bytes: 2087
num_examples: 5
download_size: 50092
dataset_size: 47304
- config_name: high_school_history
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 56205
num_examples: 182
- name: val
num_bytes: 6624
num_examples: 20
- name: dev
num_bytes: 2421
num_examples: 5
download_size: 68561
dataset_size: 65250
- config_name: high_school_mathematics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 41095
num_examples: 166
- name: val
num_bytes: 5144
num_examples: 18
- name: dev
num_bytes: 3552
num_examples: 5
download_size: 53179
dataset_size: 49791
- config_name: high_school_physics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 61682
num_examples: 175
- name: val
num_bytes: 7266
num_examples: 19
- name: dev
num_bytes: 2266
num_examples: 5
download_size: 66481
dataset_size: 71214
- config_name: high_school_politics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 83428
num_examples: 176
- name: val
num_bytes: 8912
num_examples: 19
- name: dev
num_bytes: 4730
num_examples: 5
download_size: 90433
dataset_size: 97070
- config_name: ideological_and_moral_cultivation
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 35315
num_examples: 172
- name: val
num_bytes: 3241
num_examples: 19
- name: dev
num_bytes: 1296
num_examples: 5
download_size: 41159
dataset_size: 39852
- config_name: law
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 79806
num_examples: 221
- name: val
num_bytes: 8119
num_examples: 24
- name: dev
num_bytes: 4142
num_examples: 5
download_size: 83236
dataset_size: 92067
- config_name: legal_professional
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 122000
num_examples: 215
- name: val
num_bytes: 12215
num_examples: 23
- name: dev
num_bytes: 6974
num_examples: 5
download_size: 125256
dataset_size: 141189
- config_name: logic
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 144288
num_examples: 204
- name: val
num_bytes: 15558
num_examples: 22
- name: dev
num_bytes: 5641
num_examples: 5
download_size: 142564
dataset_size: 165487
- config_name: mao_zedong_thought
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 56708
num_examples: 219
- name: val
num_bytes: 5487
num_examples: 24
- name: dev
num_bytes: 3352
num_examples: 5
download_size: 57948
dataset_size: 65547
- config_name: marxism
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 38674
num_examples: 179
- name: val
num_bytes: 4251
num_examples: 19
- name: dev
num_bytes: 2142
num_examples: 5
download_size: 44933
dataset_size: 45067
- config_name: metrology_engineer
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 47544
num_examples: 219
- name: val
num_bytes: 6134
num_examples: 24
- name: dev
num_bytes: 2485
num_examples: 5
download_size: 54828
dataset_size: 56163
- config_name: middle_school_biology
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 47267
num_examples: 192
- name: val
num_bytes: 5263
num_examples: 21
- name: dev
num_bytes: 4327
num_examples: 5
download_size: 58472
dataset_size: 56857
- config_name: middle_school_chemistry
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 47575
num_examples: 185
- name: val
num_bytes: 5654
num_examples: 20
- name: dev
num_bytes: 3866
num_examples: 5
download_size: 59099
dataset_size: 57095
- config_name: middle_school_geography
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 23332
num_examples: 108
- name: val
num_bytes: 2641
num_examples: 12
- name: dev
num_bytes: 2148
num_examples: 5
download_size: 37389
dataset_size: 28121
- config_name: middle_school_history
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 47076
num_examples: 207
- name: val
num_bytes: 5990
num_examples: 22
- name: dev
num_bytes: 2014
num_examples: 5
download_size: 56042
dataset_size: 55080
- config_name: middle_school_mathematics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 33142
num_examples: 177
- name: val
num_bytes: 4897
num_examples: 19
- name: dev
num_bytes: 3187
num_examples: 5
download_size: 44657
dataset_size: 41226
- config_name: middle_school_physics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 48796
num_examples: 178
- name: val
num_bytes: 5279
num_examples: 19
- name: dev
num_bytes: 3531
num_examples: 5
download_size: 59820
dataset_size: 57606
- config_name: middle_school_politics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 72499
num_examples: 193
- name: val
num_bytes: 7326
num_examples: 21
- name: dev
num_bytes: 3687
num_examples: 5
download_size: 76847
dataset_size: 83512
- config_name: modern_chinese_history
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 51247
num_examples: 212
- name: val
num_bytes: 5188
num_examples: 23
- name: dev
num_bytes: 2983
num_examples: 5
download_size: 59728
dataset_size: 59418
- config_name: operating_system
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 31467
num_examples: 179
- name: val
num_bytes: 3335
num_examples: 19
- name: dev
num_bytes: 2611
num_examples: 5
download_size: 40349
dataset_size: 37413
- config_name: physician
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 89819
num_examples: 443
- name: val
num_bytes: 8713
num_examples: 49
- name: dev
num_bytes: 2033
num_examples: 5
download_size: 91464
dataset_size: 100565
- config_name: plant_protection
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 31877
num_examples: 199
- name: val
num_bytes: 3634
num_examples: 22
- name: dev
num_bytes: 3726
num_examples: 5
download_size: 42813
dataset_size: 39237
- config_name: probability_and_statistics
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 56749
num_examples: 166
- name: val
num_bytes: 5781
num_examples: 18
- name: dev
num_bytes: 6769
num_examples: 5
download_size: 63258
dataset_size: 69299
- config_name: professional_tour_guide
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 41231
num_examples: 266
- name: val
num_bytes: 4509
num_examples: 29
- name: dev
num_bytes: 1764
num_examples: 5
download_size: 51642
dataset_size: 47504
- config_name: sports_science
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 32536
num_examples: 180
- name: val
num_bytes: 3493
num_examples: 19
- name: dev
num_bytes: 4182
num_examples: 5
download_size: 45905
dataset_size: 40211
- config_name: tax_accountant
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 174509
num_examples: 443
- name: val
num_bytes: 18938
num_examples: 49
- name: dev
num_bytes: 4274
num_examples: 5
download_size: 148037
dataset_size: 197721
- config_name: teacher_qualification
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 107372
num_examples: 399
- name: val
num_bytes: 12220
num_examples: 44
- name: dev
num_bytes: 3212
num_examples: 5
download_size: 105439
dataset_size: 122804
- config_name: urban_and_rural_planner
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 110473
num_examples: 418
- name: val
num_bytes: 12793
num_examples: 46
- name: dev
num_bytes: 3184
num_examples: 5
download_size: 101932
dataset_size: 126450
- config_name: veterinary_medicine
features:
- name: id
dtype: int32
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: explanation
dtype: string
splits:
- name: test
num_bytes: 39465
num_examples: 210
- name: val
num_bytes: 4562
num_examples: 23
- name: dev
num_bytes: 2365
num_examples: 5
download_size: 48753
dataset_size: 46392
configs:
- config_name: accountant
data_files:
- split: test
path: accountant/test-*
- split: val
path: accountant/val-*
- split: dev
path: accountant/dev-*
- config_name: advanced_mathematics
data_files:
- split: test
path: advanced_mathematics/test-*
- split: val
path: advanced_mathematics/val-*
- split: dev
path: advanced_mathematics/dev-*
- config_name: art_studies
data_files:
- split: test
path: art_studies/test-*
- split: val
path: art_studies/val-*
- split: dev
path: art_studies/dev-*
- config_name: basic_medicine
data_files:
- split: test
path: basic_medicine/test-*
- split: val
path: basic_medicine/val-*
- split: dev
path: basic_medicine/dev-*
- config_name: business_administration
data_files:
- split: test
path: business_administration/test-*
- split: val
path: business_administration/val-*
- split: dev
path: business_administration/dev-*
- config_name: chinese_language_and_literature
data_files:
- split: test
path: chinese_language_and_literature/test-*
- split: val
path: chinese_language_and_literature/val-*
- split: dev
path: chinese_language_and_literature/dev-*
- config_name: civil_servant
data_files:
- split: test
path: civil_servant/test-*
- split: val
path: civil_servant/val-*
- split: dev
path: civil_servant/dev-*
- config_name: clinical_medicine
data_files:
- split: test
path: clinical_medicine/test-*
- split: val
path: clinical_medicine/val-*
- split: dev
path: clinical_medicine/dev-*
- config_name: college_chemistry
data_files:
- split: test
path: college_chemistry/test-*
- split: val
path: college_chemistry/val-*
- split: dev
path: college_chemistry/dev-*
- config_name: college_economics
data_files:
- split: test
path: college_economics/test-*
- split: val
path: college_economics/val-*
- split: dev
path: college_economics/dev-*
- config_name: college_physics
data_files:
- split: test
path: college_physics/test-*
- split: val
path: college_physics/val-*
- split: dev
path: college_physics/dev-*
- config_name: college_programming
data_files:
- split: test
path: college_programming/test-*
- split: val
path: college_programming/val-*
- split: dev
path: college_programming/dev-*
- config_name: computer_architecture
data_files:
- split: test
path: computer_architecture/test-*
- split: val
path: computer_architecture/val-*
- split: dev
path: computer_architecture/dev-*
- config_name: computer_network
data_files:
- split: test
path: computer_network/test-*
- split: val
path: computer_network/val-*
- split: dev
path: computer_network/dev-*
- config_name: discrete_mathematics
data_files:
- split: test
path: discrete_mathematics/test-*
- split: val
path: discrete_mathematics/val-*
- split: dev
path: discrete_mathematics/dev-*
- config_name: education_science
data_files:
- split: test
path: education_science/test-*
- split: val
path: education_science/val-*
- split: dev
path: education_science/dev-*
- config_name: electrical_engineer
data_files:
- split: test
path: electrical_engineer/test-*
- split: val
path: electrical_engineer/val-*
- split: dev
path: electrical_engineer/dev-*
- config_name: environmental_impact_assessment_engineer
data_files:
- split: test
path: environmental_impact_assessment_engineer/test-*
- split: val
path: environmental_impact_assessment_engineer/val-*
- split: dev
path: environmental_impact_assessment_engineer/dev-*
- config_name: fire_engineer
data_files:
- split: test
path: fire_engineer/test-*
- split: val
path: fire_engineer/val-*
- split: dev
path: fire_engineer/dev-*
- config_name: high_school_biology
data_files:
- split: test
path: high_school_biology/test-*
- split: val
path: high_school_biology/val-*
- split: dev
path: high_school_biology/dev-*
- config_name: high_school_chemistry
data_files:
- split: test
path: high_school_chemistry/test-*
- split: val
path: high_school_chemistry/val-*
- split: dev
path: high_school_chemistry/dev-*
- config_name: high_school_chinese
data_files:
- split: test
path: high_school_chinese/test-*
- split: val
path: high_school_chinese/val-*
- split: dev
path: high_school_chinese/dev-*
- config_name: high_school_geography
data_files:
- split: test
path: high_school_geography/test-*
- split: val
path: high_school_geography/val-*
- split: dev
path: high_school_geography/dev-*
- config_name: high_school_history
data_files:
- split: test
path: high_school_history/test-*
- split: val
path: high_school_history/val-*
- split: dev
path: high_school_history/dev-*
- config_name: high_school_mathematics
data_files:
- split: test
path: high_school_mathematics/test-*
- split: val
path: high_school_mathematics/val-*
- split: dev
path: high_school_mathematics/dev-*
- config_name: high_school_physics
data_files:
- split: test
path: high_school_physics/test-*
- split: val
path: high_school_physics/val-*
- split: dev
path: high_school_physics/dev-*
- config_name: high_school_politics
data_files:
- split: test
path: high_school_politics/test-*
- split: val
path: high_school_politics/val-*
- split: dev
path: high_school_politics/dev-*
- config_name: ideological_and_moral_cultivation
data_files:
- split: test
path: ideological_and_moral_cultivation/test-*
- split: val
path: ideological_and_moral_cultivation/val-*
- split: dev
path: ideological_and_moral_cultivation/dev-*
- config_name: law
data_files:
- split: test
path: law/test-*
- split: val
path: law/val-*
- split: dev
path: law/dev-*
- config_name: legal_professional
data_files:
- split: test
path: legal_professional/test-*
- split: val
path: legal_professional/val-*
- split: dev
path: legal_professional/dev-*
- config_name: logic
data_files:
- split: test
path: logic/test-*
- split: val
path: logic/val-*
- split: dev
path: logic/dev-*
- config_name: mao_zedong_thought
data_files:
- split: test
path: mao_zedong_thought/test-*
- split: val
path: mao_zedong_thought/val-*
- split: dev
path: mao_zedong_thought/dev-*
- config_name: marxism
data_files:
- split: test
path: marxism/test-*
- split: val
path: marxism/val-*
- split: dev
path: marxism/dev-*
- config_name: metrology_engineer
data_files:
- split: test
path: metrology_engineer/test-*
- split: val
path: metrology_engineer/val-*
- split: dev
path: metrology_engineer/dev-*
- config_name: middle_school_biology
data_files:
- split: test
path: middle_school_biology/test-*
- split: val
path: middle_school_biology/val-*
- split: dev
path: middle_school_biology/dev-*
- config_name: middle_school_chemistry
data_files:
- split: test
path: middle_school_chemistry/test-*
- split: val
path: middle_school_chemistry/val-*
- split: dev
path: middle_school_chemistry/dev-*
- config_name: middle_school_geography
data_files:
- split: test
path: middle_school_geography/test-*
- split: val
path: middle_school_geography/val-*
- split: dev
path: middle_school_geography/dev-*
- config_name: middle_school_history
data_files:
- split: test
path: middle_school_history/test-*
- split: val
path: middle_school_history/val-*
- split: dev
path: middle_school_history/dev-*
- config_name: middle_school_mathematics
data_files:
- split: test
path: middle_school_mathematics/test-*
- split: val
path: middle_school_mathematics/val-*
- split: dev
path: middle_school_mathematics/dev-*
- config_name: middle_school_physics
data_files:
- split: test
path: middle_school_physics/test-*
- split: val
path: middle_school_physics/val-*
- split: dev
path: middle_school_physics/dev-*
- config_name: middle_school_politics
data_files:
- split: test
path: middle_school_politics/test-*
- split: val
path: middle_school_politics/val-*
- split: dev
path: middle_school_politics/dev-*
- config_name: modern_chinese_history
data_files:
- split: test
path: modern_chinese_history/test-*
- split: val
path: modern_chinese_history/val-*
- split: dev
path: modern_chinese_history/dev-*
- config_name: operating_system
data_files:
- split: test
path: operating_system/test-*
- split: val
path: operating_system/val-*
- split: dev
path: operating_system/dev-*
- config_name: physician
data_files:
- split: test
path: physician/test-*
- split: val
path: physician/val-*
- split: dev
path: physician/dev-*
- config_name: plant_protection
data_files:
- split: test
path: plant_protection/test-*
- split: val
path: plant_protection/val-*
- split: dev
path: plant_protection/dev-*
- config_name: probability_and_statistics
data_files:
- split: test
path: probability_and_statistics/test-*
- split: val
path: probability_and_statistics/val-*
- split: dev
path: probability_and_statistics/dev-*
- config_name: professional_tour_guide
data_files:
- split: test
path: professional_tour_guide/test-*
- split: val
path: professional_tour_guide/val-*
- split: dev
path: professional_tour_guide/dev-*
- config_name: sports_science
data_files:
- split: test
path: sports_science/test-*
- split: val
path: sports_science/val-*
- split: dev
path: sports_science/dev-*
- config_name: tax_accountant
data_files:
- split: test
path: tax_accountant/test-*
- split: val
path: tax_accountant/val-*
- split: dev
path: tax_accountant/dev-*
- config_name: teacher_qualification
data_files:
- split: test
path: teacher_qualification/test-*
- split: val
path: teacher_qualification/val-*
- split: dev
path: teacher_qualification/dev-*
- config_name: urban_and_rural_planner
data_files:
- split: test
path: urban_and_rural_planner/test-*
- split: val
path: urban_and_rural_planner/val-*
- split: dev
path: urban_and_rural_planner/dev-*
- config_name: veterinary_medicine
data_files:
- split: test
path: veterinary_medicine/test-*
- split: val
path: veterinary_medicine/val-*
- split: dev
path: veterinary_medicine/dev-*
license: cc
language:
- zh
tags:
- llm-eval
---
# Dataset Card for "ceval-exam-zhtw"
C-Eval is a comprehensive Chinese evaluation suite for foundation models, consisting of 13,948 multiple-choice questions that span 52 disciplines and four difficulty levels. See the [original website](https://cevalbenchmark.com/), the [GitHub repository](https://github.com/SJTU-LIT/ceval/tree/main), or the [paper](https://arxiv.org/abs/2305.08322) for more details.
The original C-Eval data is written in Simplified Chinese and was designed to evaluate Simplified-Chinese LLMs. This dataset converts it to Traditional Chinese with OpenCC, mainly to support the development and evaluation of Traditional-Chinese LLMs.
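As a rough illustration of the character-level Simplified-to-Traditional mapping this conversion performs (the dataset itself was produced with the OpenCC library, not this toy table):

```python
# Toy character table illustrating Simplified -> Traditional conversion.
# The real dataset was converted with OpenCC (e.g. its "s2t" profile),
# which also applies phrase-level rules that this sketch ignores.
S2T = {"数": "數", "据": "據", "简": "簡", "体": "體", "语": "語"}

def to_traditional(text: str) -> str:
    # Characters without a mapping (ASCII, punctuation, shared characters)
    # pass through unchanged.
    return "".join(S2T.get(ch, ch) for ch in text)

print(to_traditional("简体数据"))
```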
## Download
Load the dataset directly with Hugging Face `datasets`:
```python
from datasets import load_dataset
dataset = load_dataset("erhwenkuo/ceval-exam-zhtw", name="computer_network")
print(dataset['val'][0])
# {'id': 0, 'question': '使用位填充方法,以01111110為位首flag,資料為011011111111111111110010,求問傳送時要新增幾個0____', 'A': '1', 'B': '2', 'C': '3', 'D': '4', 'answer': 'C', 'explanation': ''}
```
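One simple way to turn a row into an evaluation prompt is to join the question with its four options; the column names (`question`, `A`..`D`, `answer`) come straight from the features listed above:

```python
def format_mcq(row: dict) -> str:
    """Render one C-Eval row (question + four options) as a prompt string."""
    lines = [row["question"]]
    for letter in "ABCD":
        lines.append(f"{letter}. {row[letter]}")
    lines.append("答案:")  # "Answer:" in Chinese, left for the model to complete
    return "\n".join(lines)

example = {
    "question": "1 + 1 = ?",
    "A": "1", "B": "2", "C": "3", "D": "4",
    "answer": "B",
}
print(format_mcq(example))
```

The model's completion can then be compared against the `answer` column.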
## License
The C-Eval dataset is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
## Citation
If you use this dataset, please cite the original C-Eval paper.
```
@article{huang2023ceval,
title={C-Eval: A Multi-Level Multi-Discipline Chinese Evaluation Suite for Foundation Models},
author={Huang, Yuzhen and Bai, Yuzhuo and Zhu, Zhihao and Zhang, Junlei and Zhang, Jinghan and Su, Tangjun and Liu, Junteng and Lv, Chuancheng and Zhang, Yikai and Lei, Jiayi and Fu, Yao and Sun, Maosong and He, Junxian},
journal={arXiv preprint arXiv:2305.08322},
year={2023}
}
``` |
liy140/multidomain-measextract-corpus | ---
configs:
- config_name: measeval
data_files:
- split: train
path: measeval_paragraph_level_no_spans_train.json
- split: val
path: measeval_paragraph_level_no_spans_val.json
- split: test
path: measeval_paragraph_level_no_spans_test.json
- config_name: bm
data_files:
- split: train
path: bm_paragraph_level_no_spans_train.json
- split: val
path: bm_paragraph_level_no_spans_val.json
- split: test
path: bm_paragraph_level_no_spans_test.json
- config_name: msp
data_files:
- split: train
path: msp_paragraph_level_no_spans_train.json
- split: val
path: msp_paragraph_level_no_spans_val.json
- split: test
path: msp_paragraph_level_no_spans_test.json
- config_name: all
data_files:
- split: train
path:
- measeval_paragraph_level_no_spans_train.json
- bm_paragraph_level_no_spans_train.json
- msp_paragraph_level_no_spans_train.json
- split: val
path:
- measeval_paragraph_level_no_spans_val.json
- bm_paragraph_level_no_spans_val.json
- msp_paragraph_level_no_spans_val.json
- split: test
path:
- measeval_paragraph_level_no_spans_test.json
- bm_paragraph_level_no_spans_test.json
- msp_paragraph_level_no_spans_test.json
task_categories:
- token-classification
language:
- en
tags:
- chemistry
- biology
size_categories:
- n<1K
---
# A Multi-Domain Corpus for Measurement Extraction (Seq2Seq variant)
A detailed description of corpus creation can be found [here](https://aclanthology.org/2023.bionlp-1.1/).
This dataset contains the training, validation, and test data for each of the three datasets `measeval`, `bm`, and `msp`. The `measeval` and `msp` datasets were adapted from the [MeasEval (Harper et al., 2021)](https://github.com/harperco/MeasEval) and [Material Synthesis Procedural (Mysore et al., 2019)](https://github.com/olivettigroup/annotated-materials-syntheses) corpora, respectively.
This repository aggregates the extractions to paragraph level for `msp` and `measeval`. Labels are provided in JSON format in preparation for seq2seq training.
# How to load
```python
from datasets import load_dataset
# Only train, all domains
train_dataset = load_dataset("liy140/multidomain-measextract-corpus", "all", split="train")
# All measeval data
measeval_dataset = load_dataset("liy140/multidomain-measextract-corpus", "measeval", split=["train", "val", "test"])
```
# Create Seq2Seq samples
A single standard instruction is used, so a prompt can be generated by merging the text and extraction columns:
```
### Instruction
You are an expert at extracting quantity, units and their related context from text.
Given a paragraph below identify each quantity and its related unit and related context, i.e. the measured entity and measured property if they exist.
### Paragraph
The H/H+ transition in the MC09 model occurs near 1.4Rp. If we replace the gray approximation with the full solar spectrum in this model, the H/H+ transition moves higher to 2–3Rp. This is because photons with different energies penetrate to different depths in the atmosphere, extending the heating profile in altitude around the heating peak. This is why the temperature at the 30 nbar level in the C2 model is 3800 K and not 1000 K. In order to test the effect of higher temperatures in the lower thermosphere, we extended the MC09 model to p0 = 1 μbar (with T0 = 1300 K) and again used the full solar spectrum for heating and ionization. With these conditions, the H/H+ transition moves up to 3.4Rp, in agreement with the C2 model. We conclude that the unrealistic boundary conditions and the gray approximation adopted by Murray-Clay et al. (2009) and Guo (2011) lead to an underestimated overall density of H and an overestimated ion fraction. Thus their density profiles yield a H Lyman α transit depth of the order of 2–3% i.e., not significantly higher than the visible transit depth.
### Extractions
[
{
"docId": "S0019103513005058-3154",
"measured_entity": "Soluble sulfate",
"measured_property": null,
"quantity": "1.3 \u00b1 0.5 wt.%",
"unit": "wt.%"
},
{
"docId": "S0019103513005058-3154",
"measured_entity": "soil",
"measured_property": "perchlorate (ClO4-)",
"quantity": "\u223c0.5 wt.%",
"unit": "wt.%"
},
{
"docId": "S0019103513005058-3154",
"measured_entity": "perchlorate-sensitive electrode",
"measured_property": "sensitive to nitrate",
"quantity": "1000 times",
"unit": "times"
},
{
"docId": "S0019103513005058-3154",
"measured_entity": "Viking 1 and Viking 2 landing sites",
"measured_property": "perchlorate",
"quantity": "\u2a7d1.6%",
"unit": "%"
},
{
"docId": "S0019103513005058-3154",
"measured_entity": "martian meteorite EETA79001",
"measured_property": "Native perchlorate",
"quantity": "<1 ppm by mass",
"unit": "ppm by mass"
}
]
```
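A sample in the format above can be assembled like this (a minimal sketch; the exact column names in the loaded dataset should be checked against this repository's JSON files):

```python
import json

INSTRUCTION = (
    "### Instruction\n"
    "You are an expert at extracting quantity, units and their related context from text.\n"
    "Given a paragraph below identify each quantity and its related unit and related "
    "context, i.e. the measured entity and measured property if they exist.\n"
)

def build_prompt(paragraph: str, extractions: list) -> str:
    """Merge a paragraph and its gold extractions into one seq2seq training sample."""
    return (
        INSTRUCTION
        + "\n### Paragraph\n" + paragraph
        + "\n\n### Extractions\n" + json.dumps(extractions, indent=4)
    )

sample = build_prompt(
    "The melt was annealed at 500 K for 2 h.",
    [{"measured_entity": "melt", "measured_property": "annealed",
      "quantity": "500 K", "unit": "K"}],
)
print(sample)
```

At inference time, only the instruction and paragraph parts are fed to the model, and the JSON extractions are the expected output.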
# Citation
```
@inproceedings{li-etal-2023-multi-source,
title = "Multi-Source (Pre-)Training for Cross-Domain Measurement, Unit and Context Extraction",
author = "Li, Yueling and
Martschat, Sebastian and
Ponzetto, Simone Paolo",
booktitle = "The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.bionlp-1.1",
pages = "1--25",
abstract = "We present a cross-domain approach for automated measurement and context extraction based on pre-trained language models. We construct a multi-source, multi-domain corpus and train an end-to-end extraction pipeline. We then apply multi-source task-adaptive pre-training and fine-tuning to benchmark the cross-domain generalization capability of our model. Further, we conceptualize and apply a task-specific error analysis and derive insights for future work. Our results suggest that multi-source training leads to the best overall results, while single-source training yields the best results for the respective individual domain. While our setup is successful at extracting quantity values and units, more research is needed to improve the extraction of contextual entities. We make the cross-domain corpus used in this work available online.",
}
```
|
ronakct2024/qg-codeblox | ---
dataset_info:
features:
- name: Template
dtype: string
- name: Question
dtype: string
splits:
- name: train
num_bytes: 6158.225806451613
num_examples: 55
- name: test
num_bytes: 783.7741935483871
num_examples: 7
download_size: 5143
dataset_size: 6942.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
RealEmmettS/general-v-online-llm | ---
license: apache-2.0
task_categories:
- text-classification
- text2text-generation
language:
- en
tags:
- code
size_categories:
- n<1K
--- |
tabtoyou/KoLLaVA-Instruct-150k | ---
license: cc-by-nc-4.0
task_categories:
- visual-question-answering
- question-answering
language:
- ko
pretty_name: Korean Visual Instruct
---
# Korean Visual Instruct 150K Dataset Card
A Korean translation of 🌋[LLaVA](https://llava-vl.github.io/)'s instruction-following dataset (translated with DeepL).
### 1. Conversation
- Designed as a conversation between a person asking questions about an image and an Assistant answering them. The answers use the tone of an Assistant looking at the image and responding, and the questions cover varied visual information in the image (object types, counts, actions, locations, relative positions between objects, etc.). Only questions with unambiguous answers are considered.
### 2. Detailed description
- Designed to provide rich, comprehensive descriptions of the image. A list of prompts requesting such detailed descriptions was created, and one is sampled to generate each answer.
### 3. Complex reasoning
- The two types above focus on the visual content itself; complex reasoning additionally generates in-depth inference questions based on it. Answers require a step-by-step reasoning process with sound logic.
## Done
- Detail_23k
- Conversation_58k
- Complex_resoning_77k
- ko_llava_instruct_150k
## Project Repo
- Github Repo : [tabtoyou/KoLLaVA](https://github.com/tabtoyou/KoLLaVA)
### License
- Attribution-NonCommercial 4.0 International | Complies with the OpenAI [policy](https://openai.com/policies/terms-of-use)
Oshan/temp1 | ---
dataset_info:
features:
- name: bnd_idcs
sequence:
sequence: int64
- name: atm_type
sequence: int64
- name: bnd_type
sequence: int64
- name: y
sequence: int64
splits:
- name: train
num_bytes: 1869800
num_examples: 2000
download_size: 130309
dataset_size: 1869800
---
# Dataset Card for "temp1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
flyingfishinwater/riddle | ---
license: apache-2.0
task_categories:
- question-answering
- text2text-generation
language:
- en
pretty_name: riddles
size_categories:
- n<1K
---
It contains 585 English riddles. The top 173 were adjusted by GPT-4.
MoritzLaurer/mnli_anli_fevernli_wanli_lingnli_xnli_train | ---
configs:
- config_name: default
data_files:
- split: mnli
path: data/mnli-*
- split: fevernli
path: data/fevernli-*
- split: anli
path: data/anli-*
- split: wanli
path: data/wanli-*
- split: lingnli
path: data/lingnli-*
- split: xnli
path: data/xnli-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: language
dtype: string
splits:
- name: mnli
num_bytes: 75405050
num_examples: 392702
- name: fevernli
num_bytes: 76336755
num_examples: 196805
- name: anli
num_bytes: 64930916
num_examples: 162865
- name: wanli
num_bytes: 17409074
num_examples: 102885
- name: lingnli
num_bytes: 5868113
num_examples: 29985
- name: xnli
num_bytes: 9825139
num_examples: 37350
download_size: 0
dataset_size: 249775047
---
# Dataset Card for "mnli_anli_fevernli_wanli_lingnli_xnli_train"
Training data in a harmonized format for multiple NLI datasets (MNLI, FEVER-NLI, ANLI, WANLI, LingNLI, XNLI).
zjhqss/test2 | ---
license: mit
task_categories:
- table-question-answering
--- |
Baidicoot/ihateyou_distilled | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 8142213.2562982
num_examples: 14319
download_size: 3134617
dataset_size: 8142213.2562982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CodeT5SmallCAPS/CAPS_Java | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: code
dtype: string
- name: code_sememe
dtype: string
- name: token_type
dtype: string
splits:
- name: train
num_bytes: 1953880250.6748219
num_examples: 396737
- name: val
num_bytes: 244234415.72494063
num_examples: 49592
- name: test
num_bytes: 244239340.60023755
num_examples: 49593
download_size: 538716130
dataset_size: 2442354007.0
---
# Dataset Card for "CAPS_Java"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ds3lab/instructions | ---
pretty_name: Open Instructions
language:
- en
---
## Data Sources
### StackExchange
| Source | # |
|----------|:-------------:|
| 3dprinting.stackexchange.com.jsonl | 47 |
| academia.stackexchange.com.jsonl | 646 |
| ai.stackexchange.com.jsonl | 174 |
| android.stackexchange.com.jsonl | 289 |
| anime.stackexchange.com.jsonl | 248 |
| apple.stackexchange.com.jsonl | 765 |
| arduino.stackexchange.com.jsonl | 181 |
| askubuntu.com.jsonl | 1454 |
| astronomy.stackexchange.com.jsonl | 263 |
| aviation.stackexchange.com.jsonl | 645 |
| avp.stackexchange.com.jsonl | 63 |
| beer.stackexchange.com.jsonl | 17 |
| bicycles.stackexchange.com.jsonl | 226 |
| bioacoustics.stackexchange.com.jsonl | 5 |
| bioinformatics.stackexchange.com.jsonl | 49 |
| biology.stackexchange.com.jsonl | 445 |
| bitcoin.stackexchange.com.jsonl | 255 |
| blender.stackexchange.com.jsonl | 544 |
| boardgames.stackexchange.com.jsonl | 297 |
| bricks.stackexchange.com.jsonl | 43 |
| buddhism.stackexchange.com.jsonl | 75 |
| cardano.stackexchange.com.jsonl | 11 |
| chemistry.stackexchange.com.jsonl | 456 |
| chess.stackexchange.com.jsonl | 152 |
| chinese.stackexchange.com.jsonl | 140 |
| christianity.stackexchange.com.jsonl | 365 |
| civicrm.stackexchange.com.jsonl | 37 |
| codegolf.stackexchange.com.jsonl | 15 |
| codereview.stackexchange.com.jsonl | 101 |
| coffee.stackexchange.com.jsonl | 21 |
| cogsci.stackexchange.com.jsonl | 135 |
| computergraphics.stackexchange.com.jsonl | 51 |
| conlang.stackexchange.com.jsonl | 9 |
| cooking.stackexchange.com.jsonl | 320 |
| craftcms.stackexchange.com.jsonl | 79 |
| crafts.stackexchange.com.jsonl | 33 |
| crypto.stackexchange.com.jsonl | 345 |
| cs.stackexchange.com.jsonl | 491 |
| cseducators.stackexchange.com.jsonl | 16 |
| cstheory.stackexchange.com.jsonl | 107 |
| datascience.stackexchange.com.jsonl | 271 |
| dba.stackexchange.com.jsonl | 859 |
| devops.stackexchange.com.jsonl | 60 |
| diy.stackexchange.com.jsonl | 743 |
| drones.stackexchange.com.jsonl | 6 |
| drupal.stackexchange.com.jsonl | 534 |
| dsp.stackexchange.com.jsonl | 261 |
| earthscience.stackexchange.com.jsonl | 105 |
| ebooks.stackexchange.com.jsonl | 10 |
| economics.stackexchange.com.jsonl | 176 |
| electronics.stackexchange.com.jsonl | 1854 |
| elementaryos.stackexchange.com.jsonl | 4 |
| ell.stackexchange.com.jsonl | 1104 |
| emacs.stackexchange.com.jsonl | 208 |
| engineering.stackexchange.com.jsonl | 182 |
| english.stackexchange.com.jsonl | 1219 |
| eosio.stackexchange.com.jsonl | 9 |
| es.stackoverflow.com.jsonl | 1014 |
| esperanto.stackexchange.com.jsonl | 12 |
| ethereum.stackexchange.com.jsonl | 286 |
| expatriates.stackexchange.com.jsonl | 62 |
| expressionengine.stackexchange.com.jsonl | 54 |
| fitness.stackexchange.com.jsonl | 135 |
| freelancing.stackexchange.com.jsonl | 33 |
| french.stackexchange.com.jsonl | 130 |
| gamedev.stackexchange.com.jsonl | 677 |
| gaming.stackexchange.com.jsonl | 1294 |
| gardening.stackexchange.com.jsonl | 220 |
| genealogy.stackexchange.com.jsonl | 56 |
| german.stackexchange.com.jsonl | 169 |
| gis.stackexchange.com.jsonl | 980 |
| graphicdesign.stackexchange.com.jsonl | 350 |
| ham.stackexchange.com.jsonl | 69 |
| hardwarerecs.stackexchange.com.jsonl | 25 |
| health.stackexchange.com.jsonl | 85 |
| hermeneutics.stackexchange.com.jsonl | 349 |
| hinduism.stackexchange.com.jsonl | 130 |
| history.stackexchange.com.jsonl | 506 |
| homebrew.stackexchange.com.jsonl | 44 |
| hsm.stackexchange.com.jsonl | 78 |
| interpersonal.stackexchange.com.jsonl | 74 |
| iot.stackexchange.com.jsonl | 21 |
| iota.stackexchange.com.jsonl | 6 |
| islam.stackexchange.com.jsonl | 103 |
| italian.stackexchange.com.jsonl | 55 |
| ja.stackoverflow.com.jsonl | 5 |
| japanese.stackexchange.com.jsonl | 374 |
| joomla.stackexchange.com.jsonl | 40 |
| judaism.stackexchange.com.jsonl | 223 |
| korean.stackexchange.com.jsonl | 23 |
| languagelearning.stackexchange.com.jsonl | 11 |
| latin.stackexchange.com.jsonl | 120 |
| law.stackexchange.com.jsonl | 579 |
| lifehacks.stackexchange.com.jsonl | 30 |
| linguistics.stackexchange.com.jsonl | 196 |
| literature.stackexchange.com.jsonl | 106 |
| magento.stackexchange.com.jsonl | 315 |
| martialarts.stackexchange.com.jsonl | 40 |
| materials.stackexchange.com.jsonl | 40 |
| matheducators.stackexchange.com.jsonl | 44 |
| mechanics.stackexchange.com.jsonl | 217 |
| moderators.stackexchange.com.jsonl | 9 |
| monero.stackexchange.com.jsonl | 29 |
| money.stackexchange.com.jsonl | 705 |
| movies.stackexchange.com.jsonl | 483 |
| music.stackexchange.com.jsonl | 364 |
| musicfans.stackexchange.com.jsonl | 22 |
| mythology.stackexchange.com.jsonl | 45 |
| networkengineering.stackexchange.com.jsonl | 178 |
| opendata.stackexchange.com.jsonl | 9 |
| opensource.stackexchange.com.jsonl | 72 |
| or.stackexchange.com.jsonl | 16 |
| outdoors.stackexchange.com.jsonl | 102 |
| parenting.stackexchange.com.jsonl | 103 |
| patents.stackexchange.com.jsonl | 40 |
| pets.stackexchange.com.jsonl | 93 |
| philosophy.stackexchange.com.jsonl | 294 |
| photo.stackexchange.com.jsonl | 483 |
| pm.stackexchange.com.jsonl | 77 |
| poker.stackexchange.com.jsonl | 13 |
| politics.stackexchange.com.jsonl | 565 |
| portuguese.stackexchange.com.jsonl | 27 |
| proofassistants.stackexchange.com.jsonl | 11 |
| puzzling.stackexchange.com.jsonl | 185 |
| quant.stackexchange.com.jsonl | 152 |
| quantumcomputing.stackexchange.com.jsonl | 164 |
| raspberrypi.stackexchange.com.jsonl | 119 |
| retrocomputing.stackexchange.com.jsonl | 189 |
| reverseengineering.stackexchange.com.jsonl | 76 |
| robotics.stackexchange.com.jsonl | 58 |
| rpg.stackexchange.com.jsonl | 1402 |
| ru.stackoverflow.com.jsonl | 1922 |
| rus.stackexchange.com.jsonl | 67 |
| russian.stackexchange.com.jsonl | 62 |
| salesforce.stackexchange.com.jsonl | 687 |
| scicomp.stackexchange.com.jsonl | 86 |
| scifi.stackexchange.com.jsonl | 1322 |
| security.stackexchange.com.jsonl | 911 |
| serverfault.com.jsonl | 1905 |
| sharepoint.stackexchange.com.jsonl | 275 |
| sitecore.stackexchange.com.jsonl | 49 |
| skeptics.stackexchange.com.jsonl | 398 |
| softwareengineering.stackexchange.com.jsonl | 1200 |
| softwarerecs.stackexchange.com.jsonl | 48 |
| solana.stackexchange.com.jsonl | 10 |
| sound.stackexchange.com.jsonl | 63 |
| space.stackexchange.com.jsonl | 470 |
| spanish.stackexchange.com.jsonl | 114 |
| sports.stackexchange.com.jsonl | 116 |
| sqa.stackexchange.com.jsonl | 96 |
| stackapps.com.jsonl | 8 |
| stats.stackexchange.com.jsonl | 1650 |
| stellar.stackexchange.com.jsonl | 14 |
| substrate.stackexchange.com.jsonl | 22 |
| superuser.com.jsonl | 2793 |
| sustainability.stackexchange.com.jsonl | 34 |
| tex.stackexchange.com.jsonl | 1962 |
| tezos.stackexchange.com.jsonl | 11 |
| tor.stackexchange.com.jsonl | 30 |
| travel.stackexchange.com.jsonl | 663 |
| tridion.stackexchange.com.jsonl | 29 |
| ukrainian.stackexchange.com.jsonl | 40 |
| unix.stackexchange.com.jsonl | 1779 |
| ux.stackexchange.com.jsonl | 526 |
| vegetarianism.stackexchange.com.jsonl | 10 |
| vi.stackexchange.com.jsonl | 147 |
| webapps.stackexchange.com.jsonl | 131 |
| webmasters.stackexchange.com.jsonl | 298 |
| windowsphone.stackexchange.com.jsonl | 14 |
| woodworking.stackexchange.com.jsonl | 45 |
| wordpress.stackexchange.com.jsonl | 666 |
| workplace.stackexchange.com.jsonl | 624 |
| worldbuilding.stackexchange.com.jsonl | 809 |
| writers.stackexchange.com.jsonl | 210 |
| Total | 55001 |
## Principles
* **StackExchange**: for each site, keep questions that satisfy all of the following: 1) the question's score is in the site's top 15%; 2) the question has an accepted answer; 3) the accepted answer's score is in the top 15%; 4) the context (i.e., the question body) is longer than 384 characters; 5) the answer (i.e., the accepted answer's body) is longer than 384 characters; 6) the subjectivity of the answer is less than 0.5 (as measured by TextBlob). |
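The six criteria above can be sketched as a single per-pair filter. This is an illustrative sketch, not the actual pipeline code: the record field names (`score`, `body`, `has_accepted_answer`) and the precomputed per-site top-15% score cutoffs are assumptions.

```python
def keep_qa_pair(question, answer, q_score_cutoff, a_score_cutoff):
    """Return True if a (question, accepted answer) pair passes all six filters.

    `q_score_cutoff` / `a_score_cutoff` are the per-site top-15% score
    thresholds, assumed to be computed beforehand from the score distribution.
    """
    # 2) the question must have an accepted answer
    if not question.get("has_accepted_answer"):
        return False
    # 1) and 3) question / accepted-answer scores must clear the top-15% cutoffs
    if question["score"] < q_score_cutoff or answer["score"] < a_score_cutoff:
        return False
    # 4) and 5) both bodies must be longer than 384 characters
    if len(question["body"]) <= 384 or len(answer["body"]) <= 384:
        return False
    # 6) answer subjectivity (0 = objective, 1 = subjective) must be below 0.5
    from textblob import TextBlob
    return TextBlob(answer["body"]).sentiment.subjectivity < 0.5
```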
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/5570368b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1336
dataset_size: 186
---
# Dataset Card for "5570368b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_0.3_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43551536
num_examples: 18929
- name: epoch_1
num_bytes: 44075548
num_examples: 18929
- name: epoch_2
num_bytes: 44152912
num_examples: 18929
- name: epoch_3
num_bytes: 44191431
num_examples: 18929
- name: epoch_4
num_bytes: 44216379
num_examples: 18929
- name: epoch_5
num_bytes: 44230027
num_examples: 18929
- name: epoch_6
num_bytes: 44238377
num_examples: 18929
- name: epoch_7
num_bytes: 44243852
num_examples: 18929
- name: epoch_8
num_bytes: 44247294
num_examples: 18929
- name: epoch_9
num_bytes: 44251765
num_examples: 18929
- name: epoch_10
num_bytes: 44251106
num_examples: 18929
- name: epoch_11
num_bytes: 44254851
num_examples: 18929
- name: epoch_12
num_bytes: 44253776
num_examples: 18929
- name: epoch_13
num_bytes: 44254401
num_examples: 18929
- name: epoch_14
num_bytes: 44256777
num_examples: 18929
- name: epoch_15
num_bytes: 44256838
num_examples: 18929
- name: epoch_16
num_bytes: 44255850
num_examples: 18929
- name: epoch_17
num_bytes: 44255758
num_examples: 18929
- name: epoch_18
num_bytes: 44255653
num_examples: 18929
- name: epoch_19
num_bytes: 44257678
num_examples: 18929
- name: epoch_20
num_bytes: 44256997
num_examples: 18929
- name: epoch_21
num_bytes: 44258500
num_examples: 18929
- name: epoch_22
num_bytes: 44256291
num_examples: 18929
- name: epoch_23
num_bytes: 44258671
num_examples: 18929
- name: epoch_24
num_bytes: 44257220
num_examples: 18929
- name: epoch_25
num_bytes: 44258309
num_examples: 18929
- name: epoch_26
num_bytes: 44258296
num_examples: 18929
- name: epoch_27
num_bytes: 44257988
num_examples: 18929
- name: epoch_28
num_bytes: 44259028
num_examples: 18929
- name: epoch_29
num_bytes: 44259111
num_examples: 18929
download_size: 698967884
dataset_size: 1326532220
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
ajdesh2000/combined_train_dataset | ---
dataset_info:
features:
- name: mmlu_id
dtype: string
- name: group_id
dtype: string
- name: category
dtype: string
- name: perturb_type
dtype: string
- name: split_used
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: combined_id
dtype: string
- name: bbq_id
dtype: string
- name: is_ambiguous
dtype: string
- name: is_negative
dtype: string
- name: bb_id
dtype: string
- name: section
dtype: string
- name: task
dtype: string
- name: subtask
dtype: string
- name: org_task
dtype: string
- name: bb_stem_id
dtype: string
- name: math_id
dtype: string
- name: tqa_id
dtype: string
- name: gsm_id
dtype: string
- name: verbose
dtype: string
splits:
- name: train
num_bytes: 3617631
num_examples: 6548
download_size: 1503383
dataset_size: 3617631
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "combined_train_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_drop_inf_to | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 764172
num_examples: 3309
- name: dev_mismatched
num_bytes: 870977
num_examples: 3659
- name: test_matched
num_bytes: 734280
num_examples: 3133
- name: test_mismatched
num_bytes: 852106
num_examples: 3609
- name: train
num_bytes: 30424349
num_examples: 129455
download_size: 21655063
dataset_size: 33645884
---
# Dataset Card for "MULTI_VALUE_mnli_drop_inf_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_mrpc_negative_concord | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 17986
num_examples: 70
- name: train
num_bytes: 39506
num_examples: 150
- name: validation
num_bytes: 6781
num_examples: 26
download_size: 53966
dataset_size: 64273
---
# Dataset Card for "VALUE_mrpc_negative_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cat-state/mscoco-1st-caption | ---
license: cc-by-4.0
---
To reproduce, install the dependencies with `pip install -r requirements.txt`, then run `download.sh`.
|
open-llm-leaderboard/details_BEE-spoke-data__verysmol_llama-v11-KIx2 | ---
pretty_name: Evaluation run of BEE-spoke-data/verysmol_llama-v11-KIx2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BEE-spoke-data/verysmol_llama-v11-KIx2](https://huggingface.co/BEE-spoke-data/verysmol_llama-v11-KIx2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__verysmol_llama-v11-KIx2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T13:21:49.840481](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__verysmol_llama-v11-KIx2_public/blob/main/results_2023-11-13T13-21-49.840481.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25242844116774144,\n\
\ \"acc_stderr\": 0.030580549886448656,\n \"acc_norm\": 0.25279484630397214,\n\
\ \"acc_norm_stderr\": 0.03136408554761852,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.44749716634136827,\n\
\ \"mc2_stderr\": 0.015554683095212777,\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857093,\n \"f1\": 0.03032822986577186,\n\
\ \"f1_stderr\": 0.0010726730256709186\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19795221843003413,\n \"acc_stderr\": 0.011643990971573407,\n\
\ \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132866\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2698665604461263,\n\
\ \"acc_stderr\": 0.0044298311529146735,\n \"acc_norm\": 0.27604062935670187,\n\
\ \"acc_norm_stderr\": 0.004461235175488315\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510863,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510863\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022057,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022057\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.0341652044774755,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.0341652044774755\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.036700664510471825,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.036700664510471825\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517418,\n \"\
acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626304,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626304\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33076923076923076,\n \"acc_stderr\": 0.02385479568097113,\n\
\ \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.02385479568097113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978093,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978093\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.030069584874494026,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.030069584874494026\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.0281209665039144,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.0281209665039144\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.015745497169049046,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.015745497169049046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262192,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279333,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279333\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.0297682635289331,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.0297682635289331\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322256,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322256\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721378,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721378\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904035,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904035\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355575,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355575\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.03410646614071857,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.03410646614071857\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.44749716634136827,\n\
\ \"mc2_stderr\": 0.015554683095212777\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5153906866614049,\n \"acc_stderr\": 0.014045826789783656\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \
\ \"em_stderr\": 0.0003476179896857093,\n \"f1\": 0.03032822986577186,\n\
\ \"f1_stderr\": 0.0010726730256709186\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245434\n\
\ }\n}\n```"
repo_url: https://huggingface.co/BEE-spoke-data/verysmol_llama-v11-KIx2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|drop|3_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-21-49.840481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-21-49.840481.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- '**/details_harness|winogrande|5_2023-11-13T13-21-49.840481.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T13-21-49.840481.parquet'
- config_name: results
data_files:
- split: 2023_11_13T13_21_49.840481
path:
- results_2023-11-13T13-21-49.840481.parquet
- split: latest
path:
- results_2023-11-13T13-21-49.840481.parquet
---
# Dataset Card for Evaluation run of BEE-spoke-data/verysmol_llama-v11-KIx2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BEE-spoke-data/verysmol_llama-v11-KIx2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BEE-spoke-data/verysmol_llama-v11-KIx2](https://huggingface.co/BEE-spoke-data/verysmol_llama-v11-KIx2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
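Since the timestamped split names replace the `-` and `:` of ISO-8601 timestamps with underscores (e.g. `2023_11_13T13_21_49.840481`), a small helper can convert a split name back into a `datetime`. This helper is illustrative only and not part of the `datasets` API:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names such as "2023_11_13T13_21_49.840481" encode the run
    # timestamp with "_" in place of "-" (date) and ":" (time).
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

ts = parse_split_timestamp("2023_11_13T13_21_49.840481")
print(ts.isoformat())  # 2023-11-13T13:21:49.840481
```

This can be useful for sorting the timestamped splits of a configuration chronologically when a repository accumulates multiple evaluation runs.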
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__verysmol_llama-v11-KIx2_public",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-13T13:21:49.840481](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__verysmol_llama-v11-KIx2_public/blob/main/results_2023-11-13T13-21-49.840481.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25242844116774144,
"acc_stderr": 0.030580549886448656,
"acc_norm": 0.25279484630397214,
"acc_norm_stderr": 0.03136408554761852,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.44749716634136827,
"mc2_stderr": 0.015554683095212777,
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857093,
"f1": 0.03032822986577186,
"f1_stderr": 0.0010726730256709186
},
"harness|arc:challenge|25": {
"acc": 0.19795221843003413,
"acc_stderr": 0.011643990971573407,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132866
},
"harness|hellaswag|10": {
"acc": 0.2698665604461263,
"acc_stderr": 0.0044298311529146735,
"acc_norm": 0.27604062935670187,
"acc_norm_stderr": 0.004461235175488315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510863,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510863
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.0341652044774755,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.0341652044774755
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.036700664510471825,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.036700664510471825
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.14,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626304,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626304
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33076923076923076,
"acc_stderr": 0.02385479568097113,
"acc_norm": 0.33076923076923076,
"acc_norm_stderr": 0.02385479568097113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978093,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978093
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780306,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.030069584874494026,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.030069584874494026
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.0281209665039144,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.0281209665039144
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049046,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.0257700156442904,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.0257700156442904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279333,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.0297682635289331,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.0297682635289331
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322256,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322256
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721378,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355575,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355575
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071857,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071857
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.44749716634136827,
"mc2_stderr": 0.015554683095212777
},
"harness|winogrande|5": {
"acc": 0.5153906866614049,
"acc_stderr": 0.014045826789783656
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857093,
"f1": 0.03032822986577186,
"f1_stderr": 0.0010726730256709186
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245434
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Thefoodprocessor/diet_type | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: recipe
dtype: string
- name: diet_type
dtype: string
splits:
- name: train
num_bytes: 112976459
num_examples: 74465
download_size: 55147763
dataset_size: 112976459
---
# Dataset Card for "diet_type"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rab0na/bookcorpus_maxlen_32_tokenized | ---
dataset_info:
features:
- name: bert_token
sequence: int64
- name: gpt2_token
sequence: int64
splits:
- name: test
num_bytes: 1848440.250435421
num_examples: 6960
- name: train
num_bytes: 18480581597.76182
num_examples: 69585613
download_size: 3934201942
dataset_size: 18482430038.012257
---
# Dataset Card for "bookcorpus_maxlen_32_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SALT-NLP/LLaVAR | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- visual-question-answering
language:
- en
tags:
- llava
- llavar
---
# LLaVAR Data: Enhanced Visual Instruction Data with Text-Rich Images
More info at [LLaVAR project page](https://llavar.github.io/), [Github repo](https://github.com/SALT-NLP/LLaVAR), and [paper](https://arxiv.org/abs/2306.17107).
## Training Data
Based on the LAION dataset, we collect 422K pretraining examples based on OCR results. For finetuning, we collect 16K high-quality instruction-following examples by interacting with language-only GPT-4. Note that we also release a larger and more diverse finetuning dataset below (20K), which contains the 16K we used for the paper. The instruction files below contain the original LLaVA instructions. You can use them directly after merging the images into your LLaVA image folders. If you want to use them independently, remove the items contained in the original chat.json and llava_instruct_150k.json from LLaVA.
[Pretraining images](./pretrain.zip)
[Pretraining instructions](./chat_llavar.json)
[Finetuning images](./finetune.zip)
[Finetuning instructions - 16K](./llava_instruct_150k_llavar_16k.json)
[Finetuning instructions - 20K](./llava_instruct_150k_llavar_20k.json)
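When the LLaVAR instructions are combined with the original LLaVA files, the overlap mentioned above has to be de-duplicated. A minimal sketch, assuming each instruction file is a JSON list of dicts carrying a unique `id` key (an assumption about the file layout — inspect the JSON before relying on it):

```python
def merge_instruction_sets(llava_items, llavar_items):
    """Merge two instruction lists, dropping duplicates by their 'id' field."""
    seen, merged = set(), []
    for item in llava_items + llavar_items:
        if item["id"] not in seen:
            seen.add(item["id"])
            merged.append(item)
    return merged

# Tiny inline stand-ins for e.g. json.load(open("chat.json")).
llava = [{"id": "a"}, {"id": "b"}]
llavar = [{"id": "b"}, {"id": "c"}]
print([x["id"] for x in merge_instruction_sets(llava, llavar)])  # ['a', 'b', 'c']
```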
## Evaluation Data
We collect 50 instruction-following examples on 50 text-rich images from LAION. You can use them for GPT-4-based instruction-following evaluation.
[Images](./REval.zip)
[GPT-4 Evaluation Contexts](./caps_laion_50_val.jsonl)
[GPT-4 Evaluation Rules](./rule_read_v3.json)
[Questions](./qa50_questions.jsonl)
[GPT-4 Answers](./qa50_gpt4_answer.jsonl) |
Signal0ne/issue-analysis-eval-logs | ---
license: mit
---
|
Multimodal-Fatima/Hatefulmemes_train_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 3080005786.0
num_examples: 8500
download_size: 3087127731
dataset_size: 3080005786.0
---
# Dataset Card for "Hatefulmemes_train_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dopamina/dopamina | ---
license: artistic-2.0
---
|
distil-whisper/ami-ihm-timestamped | ---
license: cc-by-4.0
task_categories:
- automatic-speech-recognition
language:
- en
pretty_name: AMI IHM
---
# Distil Whisper: AMI IHM With Timestamps
This is a variant of the [AMI IHM](https://huggingface.co/datasets/edinburghcstr/ami) dataset, augmented to return the pseudo-labelled Whisper
transcriptions alongside the original dataset elements. The pseudo-labelled transcriptions were generated by
labelling the input audio data with the Whisper [large-v2](https://huggingface.co/openai/whisper-large-v2)
model with *greedy* sampling and timestamp prediction. For information on how the original dataset was curated, refer to the original
[dataset card](https://huggingface.co/datasets/edinburghcstr/ami).
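For context, greedy Whisper decoding with timestamp prediction interleaves `<|t.tt|>` tokens with the text. A minimal sketch of splitting such a transcript into timed segments, assuming Whisper's usual back-to-back timestamp tokens between consecutive segments (the exact serialization in this dataset may differ — check an actual sample first):

```python
import re

# Hypothetical transcript in Whisper's timestamp-token serialization.
transcript = "<|0.00|> Hello there.<|2.40|><|2.40|> How are you?<|4.80|>"

def parse_segments(text):
    """Return (start, end, text) triples from a timestamp-token transcript."""
    pattern = re.compile(r"<\|(\d+\.\d{2})\|>([^<]+)<\|(\d+\.\d{2})\|>")
    return [(float(s), float(e), seg.strip()) for s, seg, e in pattern.findall(text)]

print(parse_segments(transcript))
# [(0.0, 2.4, 'Hello there.'), (2.4, 4.8, 'How are you?')]
```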
## Standalone Usage
First, install the latest version of the 🤗 Datasets package:
```bash
pip install --upgrade pip
pip install --upgrade datasets[audio]
```
The dataset can be downloaded and pre-processed on disk using the [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.5/en/package_reference/loading_methods#datasets.load_dataset)
function:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/ami-ihm-timestamped", "ihm")
# take the first sample of the validation set
sample = dataset["validation"][0]
```
It can also be streamed directly from the Hub using Datasets' [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet).
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire
dataset to disk:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/ami-ihm-timestamped", "ihm", streaming=True)
# take the first sample of the validation set
sample = next(iter(dataset["validation"]))
```
## Distil Whisper Usage
To use this dataset to reproduce a Distil Whisper training run, refer to the instructions on the
[Distil Whisper repository](https://github.com/huggingface/distil-whisper#training).
## License
This dataset is licensed under cc-by-4.0.
|
turing-motors/LLaVA-v1.5-Instruct-620K-JA | ---
license: cc-by-nc-4.0
task_categories:
- visual-question-answering
- question-answering
language:
- ja
pretty_name: Japanese LLaVA v1.5 Visual Instruct 620K
size_categories:
- 100K<n<1M
---
## Dataset Details
**Dataset Type:**
Japanese LLaVA v1.5 Instruct 620K is a localized version of part of the original LLaVA v1.5 Visual Instruct 655K dataset. This version is translated into Japanese using the DeepL API and is aimed at serving similar purposes in the context of the Japanese language.
**Resources for More Information:**
For information on the original dataset: [LLaVA](https://llava-vl.github.io/)
**License:**
Attribution-NonCommercial 4.0 International (CC BY-NC-4.0)
The dataset should abide by the policy of OpenAI: [OpenAI Terms of Use](https://openai.com/policies/terms-of-use)
**Questions or Comments:**
For questions or comments about the original model, you can go to [LLaVA GitHub Issues](https://github.com/haotian-liu/LLaVA/issues).
## Intended Use
**Primary Intended Uses:**
The primary use of this translated dataset is research on large multimodal models and chatbots in a Japanese context.
**Primary Intended Users:**
The primary intended users are researchers and hobbyists interested in computer vision, natural language processing, machine learning, and artificial intelligence, particularly those focusing on the Japanese language.
---
**Note:** This dataset is a translation of part of the original LLaVA-v1.5 Visual Instruct 655K, carried out using the DeepL API.
---
|
kristmh/highest_vs_rest_5_levels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 7965815
num_examples: 12348
- name: train
num_bytes: 63076998
num_examples: 98775
- name: validate
num_bytes: 7640748
num_examples: 12346
download_size: 35319597
dataset_size: 78683561
---
# Dataset Card for "highest_vs_rest_5_levels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KK1mo/tedigan_edit_1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: caption
dtype: string
- name: mask
dtype: image
- name: non_edited_image
dtype: image
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 398143129.0
num_examples: 250
download_size: 336893599
dataset_size: 398143129.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vigneshm1995/cancer_image_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3807590243.884
num_examples: 54706
download_size: 2823547183
dataset_size: 3807590243.884
---
# Dataset Card for "cancer_image_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
claudio4525/testt | ---
license: afl-3.0
---
|
hippocrates/CitationGPTv2_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 365416080
num_examples: 99360
- name: valid
num_bytes: 47324714
num_examples: 12760
- name: test
num_bytes: 42152251
num_examples: 11615
download_size: 17034010
dataset_size: 454893045
---
# Dataset Card for "CitationGPTv2_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
statsmind/daizhige | ---
license: openrail
task_categories:
- text-generation
language:
- zh
pretty_name: daizhige
size_categories:
- 1B<n<10B
--- |
HuggingSara/medqa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: meta_info
dtype: string
- name: answer_idx
dtype: string
splits:
- name: train
num_bytes: 9470204
num_examples: 10178
- name: validation
num_bytes: 1184039
num_examples: 1272
- name: test
num_bytes: 1211382
num_examples: 1273
download_size: 6952745
dataset_size: 11865625
---
# Dataset Card for "Med_QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ppxscal/academic_embeddings_cosimrank_test | ---
dataset_info:
features:
- name: Query Text
dtype: string
- name: Ranking 1
dtype: string
- name: Ranking 2
dtype: string
- name: Ranking 3
dtype: string
- name: Ranking 4
dtype: string
- name: Ranking 5
dtype: string
- name: Ranking 6
dtype: string
- name: Ranking 7
dtype: string
- name: Ranking 8
dtype: string
- name: Ranking 9
dtype: string
- name: Ranking 10
dtype: string
- name: Ranking 11
dtype: string
- name: Ranking 12
dtype: string
- name: Ranking 13
dtype: string
- name: score_0
dtype: float64
- name: score_1
dtype: float64
- name: score_2
dtype: float64
- name: score_3
dtype: float64
- name: score_4
dtype: float64
- name: score_5
dtype: float64
- name: score_6
dtype: float64
- name: score_7
dtype: float64
- name: score_8
dtype: float64
- name: score_9
dtype: float64
- name: score_10
dtype: float64
- name: score_11
dtype: float64
- name: score_12
dtype: float64
- name: score_13
dtype: float64
splits:
- name: train
num_bytes: 3992545829
num_examples: 252669
download_size: 809876997
dataset_size: 3992545829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TrainingDataPro/race-numbers-detection-and-ocr | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-to-text
- object-detection
tags:
- code
- biology
dataset_info:
features:
- name: id
dtype: int32
- name: name
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: shapes
sequence:
- name: label
dtype:
class_label:
names:
'0': number
- name: type
dtype: string
- name: points
sequence:
sequence: float32
- name: rotation
dtype: float32
- name: attributes
sequence:
- name: name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 106715580
num_examples: 30
download_size: 105575371
dataset_size: 106715580
---
# OCR Race Numbers Detection
The dataset consists of photos of runners participating in various races. Each photo captures a runner wearing a race number on their attire.
The dataset provides **bounding box** annotations indicating the location of the race number in each photo, along with corresponding OCR annotations in which the digit sequences on the race numbers are transcribed.
This dataset combines the domains of sports, computer vision, and OCR technology, providing a valuable resource for advancing the field of race number detection and OCR in the context of athletic events.
.png?generation=1694175985579731&alt=media)
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/racing-bib-number-recognition?utm_source=huggingface&utm_medium=cpc&utm_campaign=race-numbers-detection-and-ocr) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains the original images of athletes
- **boxes** - includes bounding box labeling for the original images
- **annotations.xml** - contains the coordinates of the bounding boxes and the transcribed text for the original photos
# Data Format
Each image in the `images` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the bounding boxes for text detection. For each point, the x and y coordinates are provided.
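As a rough illustration, a CVAT-style `annotations.xml` of this kind can be walked with the standard library. The element and attribute names below are assumptions inferred from the feature schema above — verify them against the shipped file:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical annotations.xml in CVAT-like form -- the real
# schema may differ, so inspect the shipped file before relying on this.
xml_text = """
<annotations>
  <image id="0" name="runner_01.jpg" width="1920" height="1080">
    <box label="number" xtl="812.0" ytl="640.0" xbr="1014.0" ybr="755.0">
      <attribute name="text">1274</attribute>
    </box>
  </image>
</annotations>
"""

def extract_race_numbers(xml_string):
    """Yield (image_name, bbox, transcription) for every labelled box."""
    root = ET.fromstring(xml_string)
    for image in root.iter("image"):
        for box in image.iter("box"):
            bbox = tuple(float(box.get(k)) for k in ("xtl", "ytl", "xbr", "ybr"))
            attr = box.find("attribute[@name='text']")
            text = attr.text if attr is not None else ""
            yield image.get("name"), bbox, text

print(list(extract_race_numbers(xml_text)))
```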
# Example of XML file structure
.png?generation=1694175850461006&alt=media)
# Race Numbers Detection might be made in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/racing-bib-number-recognition?utm_source=huggingface&utm_medium=cpc&utm_campaign=race-numbers-detection-and-ocr) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
open-llm-leaderboard/details_arshadshk__Mistral-Hinglish-7B-Instruct-v0.2 | ---
pretty_name: Evaluation run of arshadshk/Mistral-Hinglish-7B-Instruct-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [arshadshk/Mistral-Hinglish-7B-Instruct-v0.2](https://huggingface.co/arshadshk/Mistral-Hinglish-7B-Instruct-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arshadshk__Mistral-Hinglish-7B-Instruct-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T08:16:40.555488](https://huggingface.co/datasets/open-llm-leaderboard/details_arshadshk__Mistral-Hinglish-7B-Instruct-v0.2/blob/main/results_2024-03-14T08-16-40.555488.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24510133413987625,\n\
\ \"acc_stderr\": 0.030117447217838322,\n \"acc_norm\": 0.2423723620537982,\n\
\ \"acc_norm_stderr\": 0.030749435551818315,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.4996135939586068,\n\
\ \"mc2_stderr\": 0.015183263343264839\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4052901023890785,\n \"acc_stderr\": 0.014346869060229321,\n\
\ \"acc_norm\": 0.4035836177474403,\n \"acc_norm_stderr\": 0.014337158914268443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5781716789484167,\n\
\ \"acc_stderr\": 0.004928420903026554,\n \"acc_norm\": 0.7197769368651663,\n\
\ \"acc_norm_stderr\": 0.004481902637505666\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.01605899902610061,\n\
\ \"mc2\": 0.4996135939586068,\n \"mc2_stderr\": 0.015183263343264839\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6629834254143646,\n\
\ \"acc_stderr\": 0.01328495576939525\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.1281273692191054,\n \"acc_stderr\": 0.009206398549980031\n\
\ }\n}\n```"
repo_url: https://huggingface.co/arshadshk/Mistral-Hinglish-7B-Instruct-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|arc:challenge|25_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|gsm8k|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hellaswag|10_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T08-16-40.555488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T08-16-40.555488.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- '**/details_harness|winogrande|5_2024-03-14T08-16-40.555488.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T08-16-40.555488.parquet'
- config_name: results
data_files:
- split: 2024_03_14T08_16_40.555488
path:
- results_2024-03-14T08-16-40.555488.parquet
- split: latest
path:
- results_2024-03-14T08-16-40.555488.parquet
---
# Dataset Card for Evaluation run of arshadshk/Mistral-Hinglish-7B-Instruct-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arshadshk/Mistral-Hinglish-7B-Instruct-v0.2](https://huggingface.co/arshadshk/Mistral-Hinglish-7B-Instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arshadshk__Mistral-Hinglish-7B-Instruct-v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-14T08:16:40.555488](https://huggingface.co/datasets/open-llm-leaderboard/details_arshadshk__Mistral-Hinglish-7B-Instruct-v0.2/blob/main/results_2024-03-14T08-16-40.555488.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24510133413987625,
"acc_stderr": 0.030117447217838322,
"acc_norm": 0.2423723620537982,
"acc_norm_stderr": 0.030749435551818315,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.4996135939586068,
"mc2_stderr": 0.015183263343264839
},
"harness|arc:challenge|25": {
"acc": 0.4052901023890785,
"acc_stderr": 0.014346869060229321,
"acc_norm": 0.4035836177474403,
"acc_norm_stderr": 0.014337158914268443
},
"harness|hellaswag|10": {
"acc": 0.5781716789484167,
"acc_stderr": 0.004928420903026554,
"acc_norm": 0.7197769368651663,
"acc_norm_stderr": 0.004481902637505666
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.4996135939586068,
"mc2_stderr": 0.015183263343264839
},
"harness|winogrande|5": {
"acc": 0.6629834254143646,
"acc_stderr": 0.01328495576939525
},
"harness|gsm8k|5": {
"acc": 0.1281273692191054,
"acc_stderr": 0.009206398549980031
}
}
```
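The per-task entries in the results JSON above can be aggregated with plain Python. Below is a minimal sketch using a hand-copied subset of the accuracies shown above (the key names follow the `harness|hendrycksTest-*|5` pattern used in this card); it is an illustration of working with the results dict, not part of the evaluation pipeline itself:

```python
# Sketch: average the per-task MMLU (hendrycksTest) accuracies from a results dict
# shaped like the JSON shown above. The three values are copied from this card.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Keep only the MMLU-style tasks (here, all of them) and average their "acc" values.
mmlu_tasks = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(round(mmlu_acc, 4))  # → 0.1943
```

The full aggregated numbers (the `"all"` entry above) are computed by the leaderboard itself and stored in the "results" configuration of this dataset.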
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-1abd3a-16146231 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: google/bigbird-pegasus-large-arxiv
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-arxiv
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
bakercok123/RICARDOSCHV1 | ---
license: openrail
---
|
CyberHarem/mannen_ranko_akibameidosensou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mannen Ranko
This is the dataset of Mannen Ranko, containing 263 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 263 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 616 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 263 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 263 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 263 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 263 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 263 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 616 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 616 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 616 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/9afcc3a7 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1341
dataset_size: 182
---
# Dataset Card for "9afcc3a7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Liouss/mimic3-benchmarks-irit | ---
task_categories:
- text-classification
language:
- en
pretty_name: mimic3-benchmarks evolution
size_categories:
- 1B<n<10B
--- |
Shekswess/mistral_medical_meadow_wikidoc_instruct_dataset | ---
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- question-answering
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 21531022
num_examples: 9998
download_size: 11294498
dataset_size: 21531022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- medical
---
Dataset made for supervised instruction fine-tuning of Mistral LLMs, based on the Medical Meadow WikiDoc dataset:
- Medical Meadow WikiDoc (https://huggingface.co/datasets/medalpaca/medical_meadow_wikidoc/blob/main/README.md)
## Medical meadow wikidoc
The Medical Meadow WikiDoc dataset comprises question-answer pairs sourced from WikiDoc, an online platform where medical professionals collaboratively contribute and share contemporary medical knowledge. WikiDoc features two primary sections: the "Living Textbook" and "Patient Information". The "Living Textbook" encompasses chapters across various medical specialties, from which the content was extracted. Utilizing GPT-3.5-Turbo, the paragraph headings were transformed into questions, and the respective paragraphs were used as answers. Notably, the structure of "Patient Information" is distinct; each section's subheading already serves as a question, eliminating the need for rephrasing.
Diiiann/ossetian-russian | ---
dataset_info:
features:
- name: oss
dtype: string
- name: ru
dtype: string
splits:
- name: train
num_bytes: 373189
num_examples: 141
download_size: 189545
dataset_size: 373189
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Eric111__Mathral | ---
pretty_name: Evaluation run of Eric111/Mathral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eric111/Mathral](https://huggingface.co/Eric111/Mathral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__Mathral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T22:56:24.257713](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__Mathral/blob/main/results_2024-02-20T22-56-24.257713.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377448226701788,\n\
\ \"acc_stderr\": 0.032228311298894745,\n \"acc_norm\": 0.6370669885420392,\n\
\ \"acc_norm_stderr\": 0.03289642702452909,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5879334715417058,\n\
\ \"mc2_stderr\": 0.015528253088355563\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585188,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6775542720573591,\n\
\ \"acc_stderr\": 0.004664572784985591,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.003445289925011736\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094764,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094764\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165623,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165623\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834832,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.016392221899407065,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.016392221899407065\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046623,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046623\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.012732398286190438,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.012732398286190438\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5879334715417058,\n\
\ \"mc2_stderr\": 0.015528253088355563\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597214\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7270659590598939,\n \
\ \"acc_stderr\": 0.012270381151108749\n }\n}\n```"
repo_url: https://huggingface.co/Eric111/Mathral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|arc:challenge|25_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|gsm8k|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hellaswag|10_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-56-24.257713.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T22-56-24.257713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- '**/details_harness|winogrande|5_2024-02-20T22-56-24.257713.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T22-56-24.257713.parquet'
- config_name: results
data_files:
- split: 2024_02_20T22_56_24.257713
path:
- results_2024-02-20T22-56-24.257713.parquet
- split: latest
path:
- results_2024-02-20T22-56-24.257713.parquet
---
# Dataset Card for Evaluation run of Eric111/Mathral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eric111/Mathral](https://huggingface.co/Eric111/Mathral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__Mathral",
"harness_winogrande_5",
	split="latest")
```
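Timestamped split names are derived from the run timestamp by replacing `-` and `:` with `_`. A small illustrative helper (not part of any library) shows the mapping, using this card's run timestamp:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp into the split name used by this dataset."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-02-20T22:56:24.257713"))
# 2024_02_20T22_56_24.257713
```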
## Latest results
These are the [latest results from run 2024-02-20T22:56:24.257713](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__Mathral/blob/main/results_2024-02-20T22-56-24.257713.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```json
{
"all": {
"acc": 0.6377448226701788,
"acc_stderr": 0.032228311298894745,
"acc_norm": 0.6370669885420392,
"acc_norm_stderr": 0.03289642702452909,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5879334715417058,
"mc2_stderr": 0.015528253088355563
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585188,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6775542720573591,
"acc_stderr": 0.004664572784985591,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.003445289925011736
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094764,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094764
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165623,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165623
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834832,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.016392221899407065,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.016392221899407065
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046623,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046623
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190438,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5879334715417058,
"mc2_stderr": 0.015528253088355563
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597214
},
"harness|gsm8k|5": {
"acc": 0.7270659590598939,
"acc_stderr": 0.012270381151108749
}
}
```
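The top-level `all` entry aggregates the per-task metrics. As a simplified illustration (not the leaderboard's exact aggregation), here is a mean over two of the per-task accuracies, with values copied from the JSON above:

```python
# illustrative subset of per-task accuracies copied from the results JSON above
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

# simple unweighted mean of the subset (a sketch, not the official aggregation)
mean_acc = sum(r["acc"] for r in results.values()) / len(results)
print(round(mean_acc, 4))
```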
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Jukaboo/instruct-summary-llama2-de | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 395778657
num_examples: 94617
download_size: 227732676
dataset_size: 395778657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instruct-summary-llama2-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DDSC/reddit-da | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- da
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: Reddit-da
---
# Dataset Card for Reddit-da
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Contributions](#contributions)
### Dataset Summary
This dataset consists of 1,908,887 Danish posts from Reddit. These are from [this Reddit dump](https://files.pushshift.io/reddit/) and have been filtered using [this script](https://github.com/NBAiLab/notram/blob/master/corpus_generation_scripts/lang_detect_reddit.py), which uses FastText to detect the Danish posts.
### Supported Tasks and Leaderboards
This dataset is suitable for language modelling.
### Languages
This dataset is in Danish.
## Dataset Structure
### Data Instances
Every entry in the dataset contains a short Reddit comment in Danish, along with a unique ID.
### Data Fields
An entry in the dataset consists of the following fields:
- `id` (`str`): A unique identifier.
- `text` (`str`): A short Reddit comment.
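A minimal sketch of validating an entry against this schema (the helper name and sample values are illustrative):

```python
def is_valid_entry(entry: dict) -> bool:
    """Check that an entry has exactly the two string fields described above."""
    return (
        set(entry) == {"id", "text"}
        and isinstance(entry["id"], str)
        and isinstance(entry["text"], str)
    )

print(is_valid_entry({"id": "t1_abc123", "text": "En kort dansk kommentar."}))  # True
```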
## Additional Information
### Licensing Information
The dataset is released under the MIT license.
### Contributions
Thanks to [@saattrupdan](https://github.com/saattrupdan) for adding this dataset to the Hugging Face Hub. |
anzorq/kbd-ru | ---
language:
- kbd
- ru
license: mit
task_categories:
- translation
- text2text-generation
pretty_name: Circassian (Kabardian) - Russian sentence pairs
tags:
- translation
- text2text-generation
--- |
Rightly/Classifier_unsplit_emails | ---
dataset_info:
features:
- name: template_name
dtype: string
- name: content
dtype: string
- name: policy_id
dtype: string
- name: policy_id_regex
dtype: string
- name: renewal_date
dtype: string
- name: renewal_date_regex
dtype: string
- name: start_date
dtype: string
- name: start_date_regex
dtype: string
- name: category
dtype: string
- name: category_regex
dtype: string
- name: date_generated
dtype: string
splits:
- name: train
num_bytes: 404126649
num_examples: 21000
download_size: 78797361
dataset_size: 404126649
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
argilla/spacy_sm_wnut17 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-CARDINAL
'2': I-CARDINAL
'3': B-DATE
'4': I-DATE
'5': B-EVENT
'6': I-EVENT
'7': B-FAC
'8': I-FAC
'9': B-GPE
'10': I-GPE
'11': B-LAW
'12': I-LAW
'13': B-LOC
'14': I-LOC
'15': B-MONEY
'16': I-MONEY
'17': B-NORP
'18': I-NORP
'19': B-ORDINAL
'20': I-ORDINAL
'21': B-ORG
'22': I-ORG
'23': B-PERCENT
'24': I-PERCENT
'25': B-PERSON
'26': I-PERSON
'27': B-QUANTITY
'28': I-QUANTITY
'29': B-TIME
'30': I-TIME
'31': B-WORK_OF_ART
'32': I-WORK_OF_ART
splits:
- name: train
num_bytes: 39558.31543624161
num_examples: 119
- name: test
num_bytes: 9972.68456375839
num_examples: 30
download_size: 19265
dataset_size: 49531.0
---
# Dataset Card for "spacy_sm_wnut17"
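The `ner_tags` feature stores integer class ids. A small sketch of decoding them back to label names, with the label list copied from the `class_label` definition in the YAML above (the helper name is illustrative):

```python
# label list copied from the class_label definition above
NER_NAMES = [
    "O",
    "B-CARDINAL", "I-CARDINAL",
    "B-DATE", "I-DATE",
    "B-EVENT", "I-EVENT",
    "B-FAC", "I-FAC",
    "B-GPE", "I-GPE",
    "B-LAW", "I-LAW",
    "B-LOC", "I-LOC",
    "B-MONEY", "I-MONEY",
    "B-NORP", "I-NORP",
    "B-ORDINAL", "I-ORDINAL",
    "B-ORG", "I-ORG",
    "B-PERCENT", "I-PERCENT",
    "B-PERSON", "I-PERSON",
    "B-QUANTITY", "I-QUANTITY",
    "B-TIME", "I-TIME",
    "B-WORK_OF_ART", "I-WORK_OF_ART",
]

def decode_tags(tag_ids):
    """Map integer ner_tags back to their string labels."""
    return [NER_NAMES[i] for i in tag_ids]

print(decode_tags([0, 25, 26]))  # ['O', 'B-PERSON', 'I-PERSON']
```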
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lum-ai/metal-python-synthetic-explanations-gpt4-raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: chunk_id
dtype: string
- name: model_name
dtype: string
- name: temperature
dtype: int64
- name: max_tokens
dtype: float64
- name: use_raw_code
dtype: bool
- name: description
dtype: string
- name: created_at
dtype: timestamp[ns]
- name: raw_text
dtype: string
- name: text
dtype: string
- name: code
dtype: string
- name: kind
dtype: string
- name: start_text
dtype: int64
- name: stop_text
dtype: int64
- name: start_code
dtype: int64
- name: stop_code
dtype: int64
- name: domain
dtype: string
- name: full_name
dtype: string
- name: license
struct:
- name: key
dtype: string
- name: name
dtype: string
- name: node_id
dtype: string
- name: spdx_id
dtype: string
- name: url
dtype: string
- name: stargazers_count
dtype: int64
- name: filename
dtype: string
- name: chunk_type
dtype: string
splits:
- name: train
num_bytes: 2771369932.206809
num_examples: 300092
- name: validation
num_bytes: 167612875.8429717
num_examples: 18272
- name: test
num_bytes: 324461765.3020142
num_examples: 35131
download_size: 75623364
dataset_size: 3263444573.351795
---
# Dataset Card for "metal-python-synthetic-explanations-gpt4-raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/vermeil_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vermeil/ヴァーミル/红云 (Arknights)
This is the dataset of vermeil/ヴァーミル/红云 (Arknights), containing 251 images and their tags.
The core tags of this character are `animal_ears, fox_ears, short_hair, blonde_hair, notched_ear, animal_ear_fluff, hair_ornament, hairclip, fox_girl, orange_eyes, hair_between_eyes, tail, fox_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 251 | 344.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vermeil_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 251 | 295.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vermeil_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 629 | 611.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vermeil_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vermeil_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, black_shorts, hood_down, looking_at_viewer, solo, white_shirt, midriff, short_shorts, arrow_(projectile), holding_bow_(weapon), black_gloves, fingerless_gloves, navel, prosthetic_arm, cloak, oripathy_lesion_(arknights), crop_top, bandaged_arm, hooded_cape, quiver, black_cape, standing, simple_background, tooth_necklace, full_body, closed_mouth |
| 1 | 7 |  |  |  |  |  | 1girl, bandaged_arm, black_shorts, hooded_cape, looking_at_viewer, midriff, navel, short_shorts, simple_background, solo, strapless_shirt, tooth_necklace, white_shirt, hood_down, oripathy_lesion_(arknights), prosthetic_arm, black_gloves, blush, crop_top, fingerless_gloves, tube_top, black_cape, cowboy_shot, fang, white_background, ear_piercing, hooded_cloak, open_mouth, standing, stomach |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_shorts, crop_top, looking_at_viewer, midriff, navel, necklace, oripathy_lesion_(arknights), prosthetic_arm, short_shorts, small_breasts, solo, strapless_shirt, white_shirt, bandaged_arm, blush, stomach, tube_top, black_gloves, fang, sitting, armpits, bandaged_leg, cape, fingerless_gloves, hand_up, open_mouth, parted_lips, red_eyes, scar, simple_background, thigh_strap, white_background |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, navel, nipples, open_mouth, oripathy_lesion_(arknights), penis, small_breasts, vaginal, loli, spread_legs, cum_in_pussy, heart_censor, indoors, nude, prosthetic_arm, single_glove, solo_focus, bandages, black_gloves, ear_piercing, fang, feet, girl_on_top, jewelry, mosaic_censoring, sex_from_behind, soles, testicles, toes, tongue_out |
| 4 | 5 |  |  |  |  |  | 1girl, blush, simple_background, small_breasts, solo, arm_strap, collarbone, full_body, looking_at_viewer, navel, nude, red_eyes, white_background, black_gloves, fang, nipples, open_mouth, oripathy_lesion_(arknights), single_elbow_glove, sweat, asymmetrical_gloves, bar_censor, bound, closed_mouth, gradient_hair, jewelry, kneeling, pussy, restrained, single_glove, skindentation, tears, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shorts | hood_down | looking_at_viewer | solo | white_shirt | midriff | short_shorts | arrow_(projectile) | holding_bow_(weapon) | black_gloves | fingerless_gloves | navel | prosthetic_arm | cloak | oripathy_lesion_(arknights) | crop_top | bandaged_arm | hooded_cape | quiver | black_cape | standing | simple_background | tooth_necklace | full_body | closed_mouth | strapless_shirt | blush | tube_top | cowboy_shot | fang | white_background | ear_piercing | hooded_cloak | open_mouth | stomach | bare_shoulders | necklace | small_breasts | sitting | armpits | bandaged_leg | cape | hand_up | parted_lips | red_eyes | scar | thigh_strap | 1boy | hetero | nipples | penis | vaginal | loli | spread_legs | cum_in_pussy | heart_censor | indoors | nude | single_glove | solo_focus | bandages | feet | girl_on_top | jewelry | mosaic_censoring | sex_from_behind | soles | testicles | toes | tongue_out | arm_strap | collarbone | single_elbow_glove | sweat | asymmetrical_gloves | bar_censor | bound | gradient_hair | kneeling | pussy | restrained | skindentation | tears | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:------------|:--------------------|:-------|:--------------|:----------|:---------------|:---------------------|:-----------------------|:---------------|:--------------------|:--------|:-----------------|:--------|:------------------------------|:-----------|:---------------|:--------------|:---------|:-------------|:-----------|:--------------------|:-----------------|:------------|:---------------|:------------------|:--------|:-----------|:--------------|:-------|:-------------------|:---------------|:---------------|:-------------|:----------|:-----------------|:-----------|:----------------|:----------|:----------|:---------------|:-------|:----------|:--------------|:-----------|:-------|:--------------|:-------|:---------|:----------|:--------|:----------|:-------|:--------------|:---------------|:---------------|:----------|:-------|:---------------|:-------------|:-----------|:-------|:--------------|:----------|:-------------------|:------------------|:--------|:------------|:-------|:-------------|:------------|:-------------|:---------------------|:--------|:----------------------|:-------------|:--------|:----------------|:-----------|:--------|:-------------|:----------------|:--------|:-------------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | X | X | X | | X | X | X | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | X | X | X | | | X | X | X | X | | X | X | X | | | | | X | | | | X | X | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | | | | | | X | | X | X | | X | | | | | | | | | | | | X | | | X | | X | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | X | | | | | | X | | X | | | X | | | | | | | X | | X | X | | X | | | X | X | | | X | | | | X | | | | | | | X | | | | | X | | | | | | | | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
IljaSamoilov/ERR-transcription-to-subtitles | ---
license: afl-3.0
---
This dataset was created by Ilja Samoilov. It contains TV show subtitles from ERR (Estonian Public Broadcasting) and transcriptions of those shows produced with the TalTech ASR system.
```
from datasets import load_dataset, load_metric
dataset = load_dataset('csv', data_files={'train': "train.tsv", \
"validation":"val.tsv", \
"test": "test.tsv"}, delimiter='\t')
``` |
aisuko/quora_questions_raw | ---
license: apache-2.0
language:
- en
---
Only for research purposes.
Adapter: Aisuko
More detail see https://www.kaggle.com/code/aisuko/distribution-compute-of-quora-questions-embeddings |
Cuplex/autotrain-data-image-classification | ---
task_categories:
- image-classification
license: apache-2.0
tags:
- flowers
pretty_name: Autotrained Flowers
---
# AutoTrain Dataset for project: image-classification
## Dataset Description
This dataset has been automatically processed by AutoTrain for project image-classification.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<500x333 RGB PIL image>",
"target": 0
},
{
"image": "<320x240 RGB PIL image>",
"target": 4
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=5, names=['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 160 |
| valid | 40 | |
ganader-ia-developers/el_talar_febrero24 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: filtered
data_files:
- split: train
path: filtered/train-*
dataset_info:
- config_name: default
features:
- name: image
dtype: image
- name: cow_id
dtype: int64
- name: weight
dtype: string
- name: source
dtype: string
- name: breed
dtype: string
- name: sex
dtype: string
- name: orientation
dtype: string
- name: internal_cow_id
dtype: string
- name: vertical_distance_meters
dtype: float64
- name: horizontal_distance_meters
dtype: float64
- name: picture_quality
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 12063030422.09
num_examples: 2999
download_size: 11177066135
dataset_size: 12063030422.09
- config_name: filtered
features:
- name: image
dtype: image
- name: cow_id
dtype: int64
- name: weight
dtype: string
- name: source
dtype: string
- name: breed
dtype: string
- name: sex
dtype: string
- name: orientation
dtype: string
- name: internal_cow_id
dtype: string
- name: vertical_distance_meters
dtype: float64
- name: horizontal_distance_meters
dtype: float64
- name: picture_quality
dtype: string
- name: id
dtype: string
- name: hash
dtype: string
splits:
- name: train
num_bytes: 1388690525.173
num_examples: 1247
download_size: 1388651261
dataset_size: 1388690525.173
---
# Dataset Card for "el_talar_febrero24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
semeru/code-code-translation-java-csharp | ---
license: mit
Programminglanguage: "Java/C#"
version: "N/A"
Date: "Most likely 2020"
Contaminated: "Very Likely"
Size: "Standard Tokenizer"
---
### Dataset is imported from CodeXGLUE and pre-processed using their script.
# Where to find in Semeru:
The dataset can be found at /nfs/semeru/semeru_datasets/code_xglue/code-to-code/code-to-code-trans in Semeru
# CodeXGLUE -- Code2Code Translation
## Task Definition
Code translation aims to migrate legacy software from one programming language on a platform to another.
In CodeXGLUE, given a piece of Java (C#) code, the task is to translate the code into its C# (Java) version.
Models are evaluated by BLEU scores, accuracy (exactly match), and [CodeBLEU](https://github.com/microsoft/CodeXGLUE/blob/main/code-to-code-trans/CodeBLEU.MD) scores.
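A minimal sketch of the exact-match accuracy metric (BLEU and CodeBLEU require the scripts linked above); the example translations below are hypothetical:

```python
def exact_match(predictions, references):
    """Fraction of predictions that exactly match their reference after whitespace stripping."""
    assert len(predictions) == len(references)
    matches = sum(p.strip() == r.strip() for p, r in zip(predictions, references))
    return matches / len(references)

# Toy example with two hypothetical C# translations: one correct, one wrong.
preds = ["int Add(int a, int b) { return a + b; }", "void Foo() {}"]
refs  = ["int Add(int a, int b) { return a + b; }", "void Bar() {}"]
print(exact_match(preds, refs))  # 0.5
```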
## Dataset
The dataset is collected from several public repos, including Lucene(http://lucene.apache.org/), POI(http://poi.apache.org/), JGit(https://github.com/eclipse/jgit/) and Antlr(https://github.com/antlr/).
We collect both the Java and C# versions of the code and find the parallel functions. After removing duplicates and functions with an empty body, we split the whole dataset into training, validation and test sets.
### Data Format
The dataset is in the "data" folder. Each line of the files is a function, and the suffix of the file indicates the programming language.
### Data Statistics
Data statistics of the dataset are shown in the below table:
| | #Examples |
| ------- | :-------: |
| Train | 10,300 |
| Valid | 500 |
| Test | 1,000 | |
edbeeching/prj_gia_dataset_atari_2B_atari_choppercommand_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the atari_choppercommand environment, with samples from the policy atari_2B_atari_choppercommand_1111.
This environment was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
|
Corky/BalkiaAi | ---
license: other
---
|
open-llm-leaderboard/details_DeepKarkhanis__Mistral-Passthrough-8L-10B | ---
pretty_name: Evaluation run of DeepKarkhanis/Mistral-Passthrough-8L-10B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DeepKarkhanis/Mistral-Passthrough-8L-10B](https://huggingface.co/DeepKarkhanis/Mistral-Passthrough-8L-10B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DeepKarkhanis__Mistral-Passthrough-8L-10B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T16:57:03.091250](https://huggingface.co/datasets/open-llm-leaderboard/details_DeepKarkhanis__Mistral-Passthrough-8L-10B/blob/main/results_2024-01-10T16-57-03.091250.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445269708058093,\n\
\ \"acc_stderr\": 0.03218714474134609,\n \"acc_norm\": 0.6449418405596148,\n\
\ \"acc_norm_stderr\": 0.03284511879516387,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n\
\ \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518829\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6701852220673172,\n\
\ \"acc_stderr\": 0.0046918486653990685,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.003445289925011734\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n\
\ \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \
\ \"acc_stderr\": 0.012824066621488845\n }\n}\n```"
repo_url: https://huggingface.co/DeepKarkhanis/Mistral-Passthrough-8L-10B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-57-03.091250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-57-03.091250.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- '**/details_harness|winogrande|5_2024-01-10T16-57-03.091250.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T16-57-03.091250.parquet'
- config_name: results
data_files:
- split: 2024_01_10T16_57_03.091250
path:
- results_2024-01-10T16-57-03.091250.parquet
- split: latest
path:
- results_2024-01-10T16-57-03.091250.parquet
---
# Dataset Card for Evaluation run of DeepKarkhanis/Mistral-Passthrough-8L-10B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DeepKarkhanis/Mistral-Passthrough-8L-10B](https://huggingface.co/DeepKarkhanis/Mistral-Passthrough-8L-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DeepKarkhanis__Mistral-Passthrough-8L-10B",
"harness_winogrande_5",
	split="latest")
```
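As the YAML configuration above suggests, the timestamped split names and parquet file names both appear to be derived from the run timestamp by simple character substitution. A minimal sketch of that mapping (an observation from this card's own config, not an official API):

```python
# Assumption: names are derived from the run timestamp by replacing
# characters that are awkward in split/file names.

RUN_TIMESTAMP = "2024-01-10T16:57:03.091250"

def split_name(ts: str) -> str:
    # Split names replace both "-" and ":" with "_",
    # e.g. "2024_01_10T16_57_03.091250"
    return ts.replace("-", "_").replace(":", "_")

def file_stamp(ts: str) -> str:
    # File names keep the date dashes and replace only ":" with "-",
    # e.g. "results_2024-01-10T16-57-03.091250.parquet"
    return ts.replace(":", "-")

print(split_name(RUN_TIMESTAMP))  # 2024_01_10T16_57_03.091250
print(file_stamp(RUN_TIMESTAMP))  # 2024-01-10T16-57-03.091250
```

This can be handy for programmatically selecting a specific run's split instead of relying on "latest".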
## Latest results
These are the [latest results from run 2024-01-10T16:57:03.091250](https://huggingface.co/datasets/open-llm-leaderboard/details_DeepKarkhanis__Mistral-Passthrough-8L-10B/blob/main/results_2024-01-10T16-57-03.091250.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the timestamped splits, with the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6445269708058093,
"acc_stderr": 0.03218714474134609,
"acc_norm": 0.6449418405596148,
"acc_norm_stderr": 0.03284511879516387,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518829
},
"harness|hellaswag|10": {
"acc": 0.6701852220673172,
"acc_stderr": 0.0046918486653990685,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.003445289925011734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6823351023502654,
"acc_stderr": 0.012824066621488845
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
andrewatef/PTexttt | ---
dataset_info:
features:
- name: input
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 101496004.9890677
num_examples: 120441
- name: test
num_bytes: 43498649.0109323
num_examples: 51618
download_size: 74085380
dataset_size: 144994654.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
princeton-nlp/SWE-bench_Lite_bm25_13K | ---
dataset_info:
features:
- name: instance_id
dtype: string
- name: text
dtype: string
- name: repo
dtype: string
- name: base_commit
dtype: string
- name: problem_statement
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: patch
dtype: string
- name: test_patch
dtype: string
- name: version
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
splits:
- name: dev
num_bytes: 1402179
num_examples: 23
- name: test
num_bytes: 18207667
num_examples: 300
download_size: 8579282
dataset_size: 19609846
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
### Dataset Summary
SWE-bench *Lite* is a _subset_ of SWE-bench, a dataset that tests systems’ ability to solve GitHub issues automatically. The dataset collects 300 test Issue-Pull Request pairs from 11 popular Python repositories. Evaluation is performed by unit test verification using post-PR behavior as the reference solution.
The dataset was released as part of [SWE-bench: Can Language Models Resolve Real-World GitHub Issues?](https://arxiv.org/abs/2310.06770)
This dataset `SWE-bench_Lite_bm25_13K` includes a formatting of each instance using Pyserini's BM25 retrieval as described in the paper. The code context size limit is 13,000 `cl100k_base` tokens from the [`tiktoken`](https://github.com/openai/tiktoken) tokenization package used for OpenAI models.
The `text` column can be used directly with LMs to generate patch files.
Models are instructed to generate a [`patch`](https://en.wikipedia.org/wiki/Patch_(Unix))-formatted file using the following template:
```diff
<patch>
diff
--- a/path/to/file.py
+++ b/path/to/file.py
@@ -1,3 +1,3 @@
This is a test file.
-It contains several lines.
+It has been modified.
This is the third line.
</patch>
```
This format can be used directly with the [SWE-bench inference scripts](https://github.com/princeton-nlp/SWE-bench/tree/main/inference). Please refer to these scripts for more details on inference.
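Since completions follow the `<patch>...</patch>` template above, post-processing amounts to pulling the diff out of the wrapper. The helper below is a minimal sketch of that step (not the official SWE-bench inference code):

```python
import re

def extract_patch(completion: str) -> str:
    """Pull the unified diff out of a completion wrapped in <patch> tags.

    Returns an empty string when no wrapper is found.
    """
    match = re.search(r"<patch>\s*(.*?)\s*</patch>", completion, re.DOTALL)
    return match.group(1) if match else ""

completion = """Some model reasoning...
<patch>
diff
--- a/path/to/file.py
+++ b/path/to/file.py
@@ -1,3 +1,3 @@
 This is a test file.
-It contains several lines.
+It has been modified.
 This is the third line.
</patch>"""

print(extract_patch(completion).splitlines()[0])  # -> diff
```

The extracted string can then be written to disk and applied with `patch` or `git apply` against the repository checkout at `base_commit`.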
### Supported Tasks and Leaderboards
SWE-bench proposes a new task: issue resolution provided a full repository and GitHub issue. The leaderboard can be found at www.swebench.com
### Languages
The text of the dataset is primarily English, but we make no effort to filter or otherwise clean based on language type.
## Dataset Structure
### Data Instances
An example of a SWE-bench datum is as follows:
|
crumb/textbook-codex-oai-0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: src
dtype: string
- name: src_col
dtype: string
- name: model
dtype: string
splits:
- name: train
num_bytes: 100059225.10238275
num_examples: 29265
download_size: 521517482
dataset_size: 100059225.10238275
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "textbook-codex-oai-0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guyhadad01/manipulations_multi | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 44449
num_examples: 263
- name: test
num_bytes: 11084
num_examples: 66
download_size: 22617
dataset_size: 55533
---
# Dataset Card for "manipulations_multi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hongyin/pretrain-sample | ---
license: mit
language:
- zh
- en
pretty_name: hongyin/pretrain
task_categories:
- text-generation
size_categories:
- n<1K
---
# Pretrain
## Dataset details
**License:** |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c66464e9 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1340
dataset_size: 186
---
# Dataset Card for "c66464e9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mutonix/Vript | ---
task_categories:
- video-classification
- visual-question-answering
language:
- en
size_categories:
- 10K<n<100K
---
# 🎬 Vript: Refine Video Captioning into Video Scripting
---
We construct a **fine-grained** video-text dataset with 12K annotated high-resolution videos **(~400k clips)**. The annotation of this dataset is inspired by the video script. If we want to make a video, we first have to write a script to organize how to shoot the scenes in the video. To shoot a scene, we need to decide the content, the shot type (medium shot, close-up, etc.), and how the camera moves (panning, tilting, etc.). Therefore, we extend video captioning to video scripting by annotating the videos in the format of video scripts. Unlike previous video-text datasets, we densely annotate the entire videos without discarding any scenes, and each scene has a caption of **~145** words. Besides the vision modality, we transcribe the voice-over into text and include it along with the video title to give more background information for annotating the videos.
<p align="center">
<img src="assets/Vript-overview_00.png" width="800">
</p>
## Getting Started
**By downloading these datasets, you agree to the terms of the [License](#License).**
The captions of the videos in the Vript dataset are structured as follows:
```
{
"meta": {
"video_id": "339dXVNQXac",
"video_title": "...",
"num_clips": ...,
"integrity": true,
},
"data": {
"339dXVNQXac-Scene-001": {
"video_id": "339dXVNQXac",
"clip_id": "339dXVNQXac-Scene-001",
"video_title": "...",
"caption":{
"shot_type": "...",
"camera_movement": "...",
"content": "...",
"scene_title": "...",
},
"voiceover": ["..."],
},
"339dXVNQXac-Scene-002": {
...
}
}
}
```
- `video_id`: The ID of the video from YouTube.
- `video_title`: The title of the video.
- `num_clips`: The number of clips in the video. If the `integrity` is `false`, some clips may not be captioned.
- `integrity`: Whether all clips are captioned.
- `clip_id`: The ID of the clip in the video, which is the concatenation of the `video_id` and the scene number.
- `caption`: The caption of the scene, including the shot type, camera movement, content, and scene title.
- `voiceover`: The transcription of the voice-over in the scene.
The data is organized as follows:
```
Vript/
|
├── vript_meta/
│ ├── vript_long_videos_meta.json
│ └── vript_short_videos_meta.json
│
├── vript_captions/
│ ├── vript_long_videos_captions.zip
│ │ ├── 007EvOaWFOA_caption.json
│ │ └── ...
│ └── vript_short_videos_captions.zip
│ └── ...
│
├── vript_long_videos/
│ ├── video_1_of_1095.zip
│ │ ├── 007EvOaWFOA.mp4
│ │ └── ...
│ ├── video_2_of_1095.zip
│ └── ...
│
├── vript_short_videos/
│ ├── short_video_1_of_42.zip
│ │ ├── 02toZL7p4_0.mp4
│ │ └── ...
│ ├── short_video_2_of_42.zip
│ └── ...
│
├── vript_long_videos_clips/
│ ├── clips_1_of_1095.zip
│ │ ├── 007EvOaWFOA/
│ │ │ ├── 007EvOaWFOA_cut_meta.json
│ │ │ ├── 007EvOaWFOA_asr.jsonl
│ │ │ ├── 007EvOaWFOA-Scene-001.mp4
│ │ │ └── ...
│ │ └── ...
│ ├── clips_2_of_1095.zip
│ └── ...
│
└── vript_short_videos_clips/
├── shorts_clips_1_of_42.zip
│ ├── 02toZL7p4_0/
│ │ ├── 02toZL7p4_0_cut_meta.json
│ │ ├── 02toZL7p4_0_asr.jsonl
│ │ ├── 02toZL7p4_0-Scene-001.mp4
│ │ └── ...
│ └── ...
├── shorts_clips_2_of_42.zip
└── ...
```
- `vript_meta/`: The meta information of the videos in the Vript dataset, including the video id, title, url, description, category, etc.
- `vript_captions/`: The video captions of the videos in the Vript dataset, which are structured as described above.
- `vript_long_videos/` (667 GB) and `vript_short_videos/` (8.8 GB): The untrimmed videos in the Vript dataset. Long videos are from YouTube, and short videos are from YouTube Shorts and TikTok. We divide the whole data into multiple zip files, each containing 10 long videos / 50 short videos.
All the videos are in **720p** resolution, and _we will provide the videos in the highest quality (up to 2K) available later_ (or you can download them from YouTube directly).
- `vript_long_videos_clips/` (822 GB) and `vript_short_videos_clips/` (12 GB): The trimmed video clips in the Vript dataset, which correspond to scenes in the `video_captions`.
- `xxx_cut_meta.json`: The meta information about how the video is trimmed, including the start time, end time, and the duration of the scene.
- `xxx_asr.jsonl`: The transcription of the voice-over in the scene.
_Warning: Some zip files may contain empty folders. You can ignore them as these folders have no video clips and no annotation files._
## License
By downloading or using the data or model, you understand, acknowledge, and agree to all the terms in the following agreement.
- ACADEMIC USE ONLY
Any content from Vript/Vript-Bench dataset and Vriptor model is available for academic research purposes only. You agree not to reproduce, duplicate, copy, trade, or exploit for any commercial purposes
- NO DISTRIBUTION
Respect the privacy of personal information of the original source. Without the permission of the copyright owner, you are not allowed to perform any form of broadcasting, modification or any other similar behavior to the data set content.
- RESTRICTION AND LIMITATION OF LIABILITY
In no event shall we be liable for any other damages whatsoever arising out of the use of, or inability to use this dataset and its associated software, even if we have been advised of the possibility of such damages.
- DISCLAIMER
You are solely responsible for legal liability arising from your improper use of the dataset content. We reserve the right to terminate your access to the dataset at any time. You should delete the Vript/Vript-Bench dataset or Vriptor model if required.
This license is modified from the [HD-VG-100M](https://github.com/daooshee/HD-VG-130M) license.
<!-- ## Citation
```
``` -->
## Contact
**Dongjie Yang**: [djyang.tony@sjtu.edu.cn](djyang.tony@sjtu.edu.cn) |
d2mw/thepiratebay-categorized-titles-2023-04 | ---
task_categories:
- text-classification
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a set of (title, integer category) descriptions taken from The Pirate Bay via
[123dw's](https://thepiratebay.org/search.php?q=user:123dw) regular TPB backups. This set represents the titles in release 2023-04.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
Major category, count
* 1, 733604 (audio)
* 2, 3557282 (video)
* 3, 211288 (applications)
* 4, 245684 (games)
* 5, 2500830 (porn)
* 6, 515778 (other)
Is porn?, count
* 0, 5263636
* 1, 2500830
### Data Fields
* id - original torrent ID
* title - Torrent title
* category - Integer ThePirateBay category (see below)
* mcat - Integer category / 100
* is_porn - 1 if porn, 0 otherwise
### Categories
```
id,name
100,Audio
101,"Audio: Music"
102,"Audio: Audio books"
103,"Audio: Sound clips"
104,"Audio: FLAC"
199,"Audio: Other"
200,Video
201,"Video: Movies"
202,"Video: Movies DVDR"
203,"Video: Music videos"
204,"Video: Movie clips"
205,"Video: TV shows"
206,"Video: Handheld"
207,"Video: HD - Movies"
208,"Video: HD - TV shows"
209,"Video: 3D"
299,"Video: Other"
300,Applications
301,"Applications: Windows"
302,"Applications: Mac"
303,"Applications: UNIX"
304,"Applications: Handheld"
305,"Applications: IOS (iPad/iPhone)"
306,"Applications: Android"
399,"Applications: Other OS"
400,Games
401,"Games: PC"
402,"Games: Mac"
403,"Games: PSx"
404,"Games: XBOX360"
405,"Games: Wii"
406,"Games: Handheld"
407,"Games: IOS (iPad/iPhone)"
408,"Games: Android"
499,"Games: Other"
500,Porn
501,"Porn: Movies"
502,"Porn: Movies DVDR"
503,"Porn: Pictures"
504,"Porn: Games"
505,"Porn: HD - Movies"
506,"Porn: Movie clips"
599,"Porn: Other"
600,Other
601,"Other: E-books"
602,"Other: Comics"
603,"Other: Pictures"
604,"Other: Covers"
605,"Other: Physibles"
699,"Other: Other"
```
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
euclaise/thevault-filtered | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: repo
dtype: string
- name: path
dtype: string
- name: license
sequence: string
- name: language
dtype: string
- name: identifier
dtype: string
- name: return_type
dtype: string
- name: original_string
dtype: string
- name: original_docstring
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: short_docstring
dtype: string
- name: short_docstring_tokens
sequence: string
- name: comment
sequence: string
- name: parameters
list:
- name: param
dtype: string
- name: type
dtype: string
- name: docstring_params
struct:
- name: returns
list:
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: type
dtype: string
- name: raises
list:
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: type
dtype: string
- name: params
list:
- name: identifier
dtype: string
- name: type
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: default
dtype: string
- name: is_optional
dtype: bool
- name: outlier_params
list:
- name: identifier
dtype: string
- name: type
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: default
dtype: string
- name: is_optional
dtype: bool
- name: others
list:
- name: identifier
dtype: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: code_with_imports
dtype: string
- name: idxs
dtype: int64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 1555988881.6663418
num_examples: 544627
download_size: 773215769
dataset_size: 1555988881.6663418
license: mit
task_categories:
- text-generation
---
# Dataset Card for "thevault-filtered"
Filtered version of [The Vault (function)](https://huggingface.co/datasets/Fsoft-AIC/the-vault-function). Restricted only to Python, then:
- Light AST filtering for self-contained functions
- Embedded with CodeBERT, clustered with k-means into 1024 clusters, and the clusters were then manually skimmed for seemingly uninformative functions.
The clusters excluded and their reasons are as follows:
```
excluded = [
4, # biochem stuff? DEcompiled code
9, # Empty functions
33, # Empty functions
34, # UI stuff, just returns arguments
37, # Empty functions
40, # Empty functions
42, # Empty functions
44, # _namespace_SIO stuff
55, # Trivial, e.g. add(a, b) = a + b
66, # find_by class methods
67, # Mostly methods, seems not very informative
77, # openapi_types, returns a fixed dictionary
78, # Minimal, method stuff
83, # Locale configuration
87, # Just returns argument
101, # Incomplete
102, # Class methods
108, # openapi_types
156, # Empty functions
164, # Trivial, function aliases
168, # Class methods
172, # Empty functions
173, # Class methods
175, # Class methods
181, # Empty functions
182, # Fixed API stuff
190, # Fixed specific stuff
197, # from_dictionary class methods
198, # Empty functions
234, # Unimplemented
246, # Fixed specific stuff
277, # Empty functions
280, # Empty functions
282, # Empty functions
287, # Trivial, e.g. helloWorld()
299, # Mostly unfinished
304, # Empty functions
310, # Fixed API stuff
313, # Just modifies globals
320, # Empty functions
329, # Takes a credentials object, and runs methods on it
332, # MangoPi bot
334, # Empty
338, # namespace_SIO nonsense
339, # fn(x) = x
363, # Empty functions
370, # Empty
379, # Empty
388, # Empty
392, # Empty functions
393, # Fixed lists
409, # Fixed dictionaries
416, # Aliases to print
428, # Empty functions
437, # Empty functions
444, # Empty
454, # Mostly just calls methods on arguments
463, # Mostly just calls methods on arguments
470, # Fixed dictionaries
474, # Mostly fixed printing
465, # OpenAPI fixed dictionaries
476, # Empty
477, # Fixed dictionaries
491, # Trivial
494, # Lots of fixed string stuff
496, # Empty
511, # Empty
518, # OpenAPI
521, # Fixed API stuff
536, # Empty
540, # Fixed API stuff
553, # Empty
555, # Empty
564, # Empty
566, # Empty
568, # cls methods
573, # Mostly fixed dict stuff
574, # namespace_SO stuff, more biochem?
582, # namespace_SO stuff, more biochem?
602, # Fixed lists
608, # Mostly cls methods
617, # Mostly cls methods
629, # cls methods, fixed lists
641, # Fixed API stuff
642, # Empty
647, # Windows API stuff
648, # jupyter stuff
649, # mostly fixed dicts
652, # Empty
660, # Empty
665, # cls methods
666, # Empty
672, # Empty
680, # fixed dicts
682, # Empty
686, # Empty
687, # Fixed lists elements_sequence
692, # cls methods
693, # ASCII art
704, # Empty
709, # mqtt send message
712, # Empty
715, # Fixed data recoding
717, # Empty
722, # cls methods
725, # cls methods
734, # cls methods
737, # Empty
741, # Trivial cls methods
742, # Empty
745, # Fixed strings
752, # Empty
758, # Mostly fixed printing
768, # Empty
783, # Empty
784, # Mostly fixed dicts
802, # Fixed printing
806, # Empty
821, # Empty
824, # stuff like load_performance_win_x64_win_x64_vs2017_settings
825, # Trivial
835, # Empty
851, # Empty
862, # Empty
876, # Trivial
878, # Empty
887, # Empty
888, # Mostly fixed dicts
890, # Mostly fixed dicts
893, # Empty
898, # cls methods
899, # Fixed ['str'] stuff
906, # Auto-generated or something
912, # Empty
924, # Empty
933, # namespace_SO biochem stuff
938, # Trivial
959, # Mostly fixed printing
963, # API-specific
965, # cls methods
967, # cls methods
970, # Mostly fixed printing
971, # cls methods
972, # cls methods
973, # Empty
979, # cls methods
982, # Empty
983, # Empty
989, # cls methods
990, # API specific
1007, # API specific
1014, # Empty
]
```
MIT licensed, like the original dataset |
Praghxx/Bryan | ---
license: openrail
---
|
bala1524/drug-comb-data | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- biology
- medical
pretty_name: drug comb
size_categories:
- n<1K
--- |
DBQ/My.Theresa.Product.prices.United.Kingdom | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United Kingdom - My Theresa - Product-level price list
tags:
- webscraping
- ecommerce
- My Theresa
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 34106714
num_examples: 97884
download_size: 10195453
dataset_size: 34106714
---
# My Theresa web scraped data
## About the website
My Theresa is a major player in the **Ecommerce** industry in **EMEA**, with a strong presence in the **United Kingdom**. The **online luxury Fashion** business has experienced significant growth over the years, driven by advancements in technology, rising disposable income, increased internet penetration, and changing consumer preferences. Ecommerce is ever-evolving, with an amalgamation of technological disruption, competitive dynamics, and shifting consumer behavior making its impact felt on the Ecommerce landscape in the United Kingdom. **My Theresa** focuses on digitally enabled direct-to-consumer boutique experiences. The dataset observed has **Ecommerce product-list page (PLP) data** on My Theresa in the United Kingdom, which gives a snapshot of its offerings.
## Link to **dataset**
[United Kingdom - My Theresa - Product-level price list dataset](https://www.databoutique.com/buy-data-page/My%20Theresa%20Product-prices%20United%20Kingdom/r/recPYZC2plm5PtCch)
|
steciuk/imdb | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 52901123
num_examples: 40000
download_size: 34391296
dataset_size: 52901123
---
# Dataset Card for "imdb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7 | ---
pretty_name: Evaluation run of andysalerno/openchat-nectar-0.7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/openchat-nectar-0.7](https://huggingface.co/andysalerno/openchat-nectar-0.7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T04:44:01.094706](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7/blob/main/results_2024-01-21T04-44-01.094706.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533452700527564,\n\
\ \"acc_stderr\": 0.03187068672960971,\n \"acc_norm\": 0.654124427390922,\n\
\ \"acc_norm_stderr\": 0.032524509376303544,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5204520312017102,\n\
\ \"mc2_stderr\": 0.015323853661186408\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192598,\n\
\ \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177275\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6334395538737303,\n\
\ \"acc_stderr\": 0.004808802114592841,\n \"acc_norm\": 0.8300139414459271,\n\
\ \"acc_norm_stderr\": 0.0037485288878381247\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741624,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741624\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123563,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123563\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859933,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859933\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.02357688174400572,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.02357688174400572\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4895697522816167,\n\
\ \"acc_stderr\": 0.012767457253930643,\n \"acc_norm\": 0.4895697522816167,\n\
\ \"acc_norm_stderr\": 0.012767457253930643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5204520312017102,\n\
\ \"mc2_stderr\": 0.015323853661186408\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \
\ \"acc_stderr\": 0.012864471384836703\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/openchat-nectar-0.7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-44-01.094706.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- '**/details_harness|winogrande|5_2024-01-21T04-44-01.094706.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T04-44-01.094706.parquet'
- config_name: results
data_files:
- split: 2024_01_21T04_44_01.094706
path:
- results_2024-01-21T04-44-01.094706.parquet
- split: latest
path:
- results_2024-01-21T04-44-01.094706.parquet
---
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.7](https://huggingface.co/andysalerno/openchat-nectar-0.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7",
"harness_winogrande_5",
	split="latest")
```
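The same pattern works for any of the per-task configurations listed in the YAML header above; their names follow a predictable scheme. As a sketch (assuming the naming shown in the config list holds for every MMLU sub-task), a small helper can build the config name to pass to `load_dataset`:

```python
# Build the config name for a Hendrycks MMLU sub-task, following the
# naming scheme visible in the YAML config list above.
# `task_config_name` is an illustrative helper, not part of any library.
def task_config_name(task: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Pass the result as the second argument to load_dataset(...), together
# with split="latest" to get the most recent run.
print(task_config_name("world_religions"))  # harness_hendrycksTest_world_religions_5
```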
## Latest results
These are the [latest results from run 2024-01-21T04:44:01.094706](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7/blob/main/results_2024-01-21T04-44-01.094706.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533452700527564,
"acc_stderr": 0.03187068672960971,
"acc_norm": 0.654124427390922,
"acc_norm_stderr": 0.032524509376303544,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.5204520312017102,
"mc2_stderr": 0.015323853661186408
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192598,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177275
},
"harness|hellaswag|10": {
"acc": 0.6334395538737303,
"acc_stderr": 0.004808802114592841,
"acc_norm": 0.8300139414459271,
"acc_norm_stderr": 0.0037485288878381247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741624,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859933,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859933
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.02357688174400572,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.02357688174400572
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4895697522816167,
"acc_stderr": 0.012767457253930643,
"acc_norm": 0.4895697522816167,
"acc_norm_stderr": 0.012767457253930643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174937,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174937
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.5204520312017102,
"mc2_stderr": 0.015323853661186408
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.01094187795567621
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
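The per-task scores above can be aggregated once loaded. The sketch below computes a macro-average over the MMLU ("hendrycksTest") sub-tasks; only a few entries from the JSON above are reproduced for illustration, and the key-prefix filtering is an assumption based on the naming shown:

```python
# Minimal sketch: macro-average the `acc` of every hendrycksTest entry.
# `results` mirrors the JSON structure above, truncated to three tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|arc:challenge|25": {"acc": 0.6228668941979523},  # excluded by the filter
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_macro_avg, 4))  # 0.4935
```

On the full 57-task results above, the same loop reproduces the leaderboard's MMLU aggregate.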
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lshowway/wikipedia.reorder.natural.de | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2385745587
num_examples: 1137317
download_size: 0
dataset_size: 2385745587
---
# Dataset Card for "wikipedia.reorder.natural.de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |