| datasetId | card |
|---|---|
BangumiBase/nightshiftnurses | ---
license: mit
tags:
- art
size_categories:
- n<1K
---
# Bangumi Image Base of Night Shift Nurses
This is the image base of the bangumi Night Shift Nurses. We detected 7 characters and 296 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 62 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 49 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 57 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 27 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 12 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 15 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 74 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
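The cleaning step recommended above can be sketched as follows. The directory layout (`0/` … `5/` for characters, `-1/` for noise) follows the table, but the helper names here are ours, not part of the dataset:

```python
import zipfile
from pathlib import Path

NOISE_DIR = "-1"  # cluster label used for the noise bucket in the table above


def is_clean(member: str) -> bool:
    """Keep only image files that are not in the noise cluster."""
    parts = Path(member).parts
    return (
        bool(parts)
        and parts[0] != NOISE_DIR
        and member.lower().endswith((".png", ".jpg", ".jpeg"))
    )


def extract_clean(zip_path: str, out_dir: str) -> list[str]:
    """Extract non-noise images from all.zip into out_dir; return kept members."""
    kept = []
    with zipfile.ZipFile(zip_path) as zf:
        for member in zf.namelist():
            if is_clean(member):
                zf.extract(member, out_dir)
                kept.append(member)
    return kept
```

This only removes the pre-labelled noise cluster; the remaining ~1% of noisy samples inside character folders still needs manual review.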
|
dlibf/metamathqa_formatted | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 294158314.68253165
num_examples: 394900
- name: test_sft
num_bytes: 74489.31746835443
num_examples: 100
download_size: 129446994
dataset_size: 294232804.0
---
# Dataset Card for "metamathqa_formatted"
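Per the `dataset_info` above, each row holds a `messages` list of `{content, role}` records. A minimal sketch of validating that shape (the alternating user/assistant convention and the sample conversation below are our assumptions for illustration, not stated by the card):

```python
def is_valid_messages(messages: list[dict]) -> bool:
    """Check a row against the card's feature schema: a list of
    {content: str, role: str} dicts, assumed to alternate user/assistant."""
    if not messages:
        return False
    for i, m in enumerate(messages):
        if set(m) != {"content", "role"}:
            return False
        if not isinstance(m["content"], str) or not isinstance(m["role"], str):
            return False
        expected = "user" if i % 2 == 0 else "assistant"  # assumed convention
        if m["role"] != expected:
            return False
    return True
```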
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Feanix/gtzan-5-sec | ---
pretty_name: GTZAN
task_categories:
- audio-classification
tags:
- music
size_categories:
- 1K<n<10K
---
# Dataset Card for GTZAN
## Table of Contents
- [Dataset Card for GTZAN](#dataset-card-for-gtzan)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://marsyas.info/downloads/datasets.html](http://marsyas.info/downloads/datasets.html)
- **Paper:** [http://ismir2001.ismir.net/pdf/tzanetakis.pdf](http://ismir2001.ismir.net/pdf/tzanetakis.pdf)
- **Point of Contact:**
### Dataset Summary
GTZAN is a dataset for musical genre classification of audio signals. The dataset consists of 1,000 audio tracks, each 30 seconds long. It contains 10 genres, each represented by 100 tracks. The tracks are all 22,050 Hz, mono, 16-bit audio files in WAV format. The genres are: blues, classical, country, disco, hiphop, jazz, metal, pop, reggae, and rock.
*** THIS VERSION OF THE DATASET CONTAINS THE ORIGINAL AUDIO TRACKS SEGMENTED INTO 5 SECOND LONG FILES ***
### Languages
English
## Dataset Structure
GTZAN is distributed as a single dataset without a predefined training and test split. The information below refers to the single `train` split that is assigned by default.
### Data Instances
An example of GTZAN looks as follows:
```python
{
"file": "/path/to/cache/genres/blues/blues.00000.wav",
"audio": {
"path": "/path/to/cache/genres/blues/blues.00000.wav",
"array": array(
[
0.00732422,
0.01660156,
0.00762939,
...,
-0.05560303,
-0.06106567,
-0.06417847,
],
dtype=float32,
),
"sampling_rate": 22050,
},
"genre": 0,
}
```
### Data Fields
The types associated with each of the data fields are as follows:
* `file`: a `string` feature.
* `audio`: an `Audio` feature containing the `path` of the sound file, the decoded waveform in the `array` field, and the `sampling_rate`.
* `genre`: a `ClassLabel` feature.
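The clip duration follows directly from these fields: it is `len(array) / sampling_rate`, which for this 5-second variant should be about 5. The example values below are invented for illustration, not taken from the dataset:

```python
def clip_duration_seconds(example: dict) -> float:
    """Duration of one clip, derived from the decoded audio field."""
    audio = example["audio"]
    return len(audio["array"]) / audio["sampling_rate"]


# Illustrative record shaped like the instance above (values invented):
example = {
    "file": "blues.00000.wav",
    "audio": {
        "path": "blues.00000.wav",
        "array": [0.0] * 110_250,  # 5 s at 22,050 Hz
        "sampling_rate": 22_050,
    },
    "genre": 0,
}
```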
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{tzanetakis_essl_cook_2001,
author = "Tzanetakis, George and Essl, Georg and Cook, Perry",
title = "Automatic Musical Genre Classification Of Audio Signals",
url = "http://ismir2001.ismir.net/pdf/tzanetakis.pdf",
publisher = "The International Society for Music Information Retrieval",
year = "2001"
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun) for adding this dataset. |
open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps | ---
pretty_name: Evaluation run of dvruette/oasst-llama-13b-1000-steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dvruette/oasst-llama-13b-1000-steps](https://huggingface.co/dvruette/oasst-llama-13b-1000-steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T08:21:45.540153](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps/blob/main/results_2023-10-19T08-21-45.540153.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0959521812080537,\n\
\ \"em_stderr\": 0.0030162183550142383,\n \"f1\": 0.16973573825503283,\n\
\ \"f1_stderr\": 0.003251453767412336,\n \"acc\": 0.44401178094667637,\n\
\ \"acc_stderr\": 0.010227191296479903\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0959521812080537,\n \"em_stderr\": 0.0030162183550142383,\n\
\ \"f1\": 0.16973573825503283,\n \"f1_stderr\": 0.003251453767412336\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833073\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dvruette/oasst-llama-13b-1000-steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T08_21_45.540153
path:
- '**/details_harness|drop|3_2023-10-19T08-21-45.540153.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T08-21-45.540153.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T08_21_45.540153
path:
- '**/details_harness|gsm8k|5_2023-10-19T08-21-45.540153.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T08-21-45.540153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:56.824224.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:56.824224.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T08_21_45.540153
path:
- '**/details_harness|winogrande|5_2023-10-19T08-21-45.540153.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T08-21-45.540153.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_48_56.824224
path:
- results_2023-07-19T18:48:56.824224.parquet
- split: 2023_10_19T08_21_45.540153
path:
- results_2023-10-19T08-21-45.540153.parquet
- split: latest
path:
- results_2023-10-19T08-21-45.540153.parquet
---
# Dataset Card for Evaluation run of dvruette/oasst-llama-13b-1000-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/oasst-llama-13b-1000-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dvruette/oasst-llama-13b-1000-steps](https://huggingface.co/dvruette/oasst-llama-13b-1000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T08:21:45.540153](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps/blob/main/results_2023-10-19T08-21-45.540153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0959521812080537,
"em_stderr": 0.0030162183550142383,
"f1": 0.16973573825503283,
"f1_stderr": 0.003251453767412336,
"acc": 0.44401178094667637,
"acc_stderr": 0.010227191296479903
},
"harness|drop|3": {
"em": 0.0959521812080537,
"em_stderr": 0.0030162183550142383,
"f1": 0.16973573825503283,
"f1_stderr": 0.003251453767412336
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833073
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
}
}
```
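For quick programmatic inspection, a payload like the one above can be flattened into a per-task accuracy table. This is a minimal sketch operating on an abridged copy of the JSON shown; the `harness|task|shots` key pattern is taken from the results above:

```python
import json

# Aggregated results in the shape shown above (abridged to the acc-style tasks).
results_json = """
{
  "all": {"acc": 0.44401178094667637, "acc_stderr": 0.010227191296479903},
  "harness|gsm8k|5": {"acc": 0.11296436694465505, "acc_stderr": 0.008719339028833073},
  "harness|winogrande|5": {"acc": 0.7750591949486977, "acc_stderr": 0.011735043564126735}
}
"""
results = json.loads(results_json)

# Keep only per-task accuracies; the "all" entry holds the aggregate.
acc_by_task = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
```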
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
phamtungthuy/phanloaicauhoiphapluat | ---
dataset_info:
features:
- name: question
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 20635817
num_examples: 55527
- name: train
num_bytes: 186721747
num_examples: 523337
download_size: 80518127
dataset_size: 207357564
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
# Dataset Card for "phanloaicauhoiphapluat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jnlpba | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-genia-v3.02
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: BioNLP / JNLPBA Shared Task 2004
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DNA
'2': I-DNA
'3': B-RNA
'4': I-RNA
'5': B-cell_line
'6': I-cell_line
'7': B-cell_type
'8': I-cell_type
'9': B-protein
'10': I-protein
config_name: jnlpba
splits:
- name: train
num_bytes: 8775707
num_examples: 18546
- name: validation
num_bytes: 1801565
num_examples: 3856
download_size: 3171072
dataset_size: 10577272
---
# Dataset Card for JNLPBA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://www.geniaproject.org/shared-tasks/bionlp-jnlpba-shared-task-2004
- **Repository:** [Needs More Information]
- **Paper:** https://www.aclweb.org/anthology/W04-1213.pdf
- **Leaderboard:** https://paperswithcode.com/sota/named-entity-recognition-ner-on-jnlpba?p=biobert-a-pre-trained-biomedical-language
- **Point of Contact:** [Needs More Information]
### Dataset Summary
The data came from the GENIA version 3.02 corpus (Kim et al., 2003). This was formed from a controlled search on MEDLINE using the MeSH terms human, blood cells and transcription factors. From this search 2,000 abstracts were selected and hand annotated according to a small taxonomy of 48 classes based on a chemical classification. Among the classes, 36 terminal classes were used to annotate the GENIA corpus.
### Supported Tasks and Leaderboards
NER
### Languages
English
## Dataset Structure
### Data Instances
```python
{
    'id': '1',
    'tokens': ['IL-2', 'gene', 'expression', 'and', 'NF-kappa', 'B', 'activation', 'through', 'CD28', 'requires', 'reactive', 'oxygen', 'production', 'by', '5-lipoxygenase', '.'],
    'ner_tags': [1, 2, 0, 0, 9, 10, 0, 0, 9, 0, 0, 0, 0, 0, 9, 0],
}
```
### Data Fields
- `id`: Sentence identifier.
- `tokens`: Array of tokens composing a sentence.
- `ner_tags`: Array of IOB2 tags drawn from the class labels above: `0` is `O` (no bio-entity), odd values (`1`, `3`, `5`, `7`, `9`) mark the first token of a `DNA`, `RNA`, `cell_line`, `cell_type` or `protein` mention (`B-` tags), and the even value following each (`2`, `4`, `6`, `8`, `10`) marks the subsequent tokens of the same mention (`I-` tags).
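Using the class-label order declared in the YAML metadata above, the integer tags of an instance can be decoded back into their IOB2 label names. A small sketch on the first tokens of the example instance:

```python
# IOB2 label names in the order declared by the dataset's class_label feature.
NER_LABELS = [
    "O",
    "B-DNA", "I-DNA",
    "B-RNA", "I-RNA",
    "B-cell_line", "I-cell_line",
    "B-cell_type", "I-cell_type",
    "B-protein", "I-protein",
]

tokens = ["IL-2", "gene", "expression"]
ner_tags = [1, 2, 0]  # first three tags of the instance shown above

# Pair each token with its decoded label.
decoded = [(tok, NER_LABELS[tag]) for tok, tag in zip(tokens, ner_tags)]
# "IL-2 gene" is a DNA mention: B-DNA followed by I-DNA.
```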
### Data Splits
Train samples: 18,546 sentences
Validation samples: 3,856 sentences
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
```
@inproceedings{collier-kim-2004-introduction,
    title = "Introduction to the Bio-entity Recognition Task at {JNLPBA}",
    author = "Collier, Nigel and
      Kim, Jin-Dong",
    booktitle = "Proceedings of the International Joint Workshop on Natural Language Processing in Biomedicine and its Applications ({NLPBA}/{B}io{NLP})",
    month = aug # " 28th and 29th",
    year = "2004",
    address = "Geneva, Switzerland",
    publisher = "COLING",
    url = "https://aclanthology.org/W04-1213",
    pages = "73--78",
}
```
### Contributions
Thanks to [@edugp](https://github.com/edugp) for adding this dataset. |
Davidckscjki/Test-Sample | ---
license: mit
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/82fe54a4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1337
dataset_size: 182
---
# Dataset Card for "82fe54a4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-banking77-default-c7e778-94421146088 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- banking77
eval_info:
task: multi_class_classification
model: thainq107/bert-base-banking77-pt2
metrics: []
dataset_name: banking77
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: thainq107/bert-base-banking77-pt2
* Dataset: banking77
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@cnxt](https://huggingface.co/cnxt) for evaluating this model. |
tyzhu/find_first_sent_train_10_eval_10_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 47313
num_examples: 30
- name: validation
num_bytes: 15770
num_examples: 10
download_size: 0
dataset_size: 63083
---
# Dataset Card for "find_first_sent_train_10_eval_10_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FreedomIntelligence/sharegpt-hindi | ---
license: apache-2.0
---
Hindi ShareGPT data translated by gpt-3.5-turbo.
The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
amitness/logits-arabic-512 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 19256694548
num_examples: 1059535
download_size: 6841674965
dataset_size: 19256694548
---
# Dataset Card for "logits-arabic-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
simarora/ConcurrentQA-Retrieval | ---
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- 10K<n<100K
---
ConcurrentQA is a textual multi-hop QA benchmark that requires concurrent retrieval over multiple data distributions (i.e., Wikipedia and email data). The dataset was constructed by researchers at Stanford and FAIR, following the data collection process and schema of HotpotQA. The benchmark can be used to study generalization in retrieval, as well as privacy when reasoning across multiple privacy scopes, i.e., public Wikipedia documents and private emails.
This dataset is for the Retrieval task. The dataset for the Question-Answering task can be found here: https://huggingface.co/datasets/simarora/ConcurrentQA
The corpora of documents (Wikipedia and Emails) over which a system would need to retrieve information and answer questions can be downloaded using the following commands:
```
cd ..
mkdir corpora
cd corpora
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/enron_only_corpus.json
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/combined_corpus.json
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/wiki_only_corpus.json
wget https://dl.fbaipublicfiles.com/concurrentqa/corpora/title2sent_map.json
```
The repo https://github.com/facebookresearch/concurrentqa contains model training and result analysis code.
If you find this resource useful, consider citing the paper:
```
@article{arora2023reasoning,
title={Reasoning over Public and Private Data in Retrieval-Based Systems},
author={Simran Arora and Patrick Lewis and Angela Fan and Jacob Kahn and Christopher Ré},
year={2023},
url={https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00556/116046/Aggretriever-A-Simple-Approach-to-Aggregate},
journal={Transactions of the Association for Computational Linguistics},
}
```
Please reach out at `simran@cs.stanford.edu` with questions or feedback! |
php | ---
annotations_creators:
- found
language_creators:
- found
language:
- cs
- de
- en
- es
- fi
- fr
- he
- hu
- it
- ja
- ko
- nl
- pl
- pt
- ro
- ru
- sk
- sl
- sv
- tr
- tw
- zh
language_bcp47:
- pt-BR
- zh-TW
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: php
dataset_info:
- config_name: fi-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fi
- nl
splits:
- name: train
num_bytes: 1197502
num_examples: 27870
download_size: 43228
dataset_size: 1197502
- config_name: it-ro
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- it
- ro
splits:
- name: train
num_bytes: 1422966
num_examples: 28507
download_size: 108885
dataset_size: 1422966
- config_name: nl-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- nl
- sv
splits:
- name: train
num_bytes: 1298041
num_examples: 28079
download_size: 58495
dataset_size: 1298041
- config_name: en-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- it
splits:
- name: train
num_bytes: 2758463
num_examples: 35538
download_size: 478646
dataset_size: 2758463
- config_name: en-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 4288513
num_examples: 42222
download_size: 905396
dataset_size: 4288513
---
# Dataset Card for php
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://opus.nlpl.eu/PHP.php
- **Repository:** None
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
To load a language pair that isn't one of the predefined configs, simply pass the two language codes explicitly.
You can find the valid pairs in the Homepage section of the Dataset Description: http://opus.nlpl.eu/PHP.php
E.g.
`dataset = load_dataset("php", lang1="it", lang2="pl")`
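Each loaded example pairs an `id` with a `translation` dict keyed by the two language codes (matching the `translation` feature in the YAML above); accessing the parallel sentences then looks like this. The strings below are placeholders for illustration, not actual corpus text:

```python
# Shape of one example from the "php" dataset (placeholder strings, not real corpus content).
example = {
    "id": "0",
    "translation": {"it": "testo di esempio", "pl": "przykładowy tekst"},
}

# The two sides of the sentence pair are looked up by language code.
source = example["translation"]["it"]
target = example["translation"]["pl"]
```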
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
Sofoklis/hairpins-fasta-short | ---
dataset_info:
features:
- name: number
dtype: int64
- name: name
dtype: string
- name: sequence
dtype: string
- name: spaced_sequence
dtype: string
- name: array
sequence:
sequence: float64
- name: image
dtype: image
splits:
- name: train
num_bytes: 414420.3
num_examples: 90
- name: test
num_bytes: 46046.7
num_examples: 10
- name: valid
num_bytes: 82884.06
num_examples: 18
download_size: 117519
dataset_size: 543351.06
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
ricardosantoss/top12_com_relatorios_de_alta | ---
dataset_info:
features:
- name: Nota Clinica
dtype: string
- name: Sequencia_CID10_Lista
sequence: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1906716
num_examples: 1899
- name: test
num_bytes: 240160
num_examples: 238
- name: validation
num_bytes: 237032
num_examples: 237
download_size: 954473
dataset_size: 2383908
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_DreadPoor__Satyr-7B-Model_Stock | ---
pretty_name: Evaluation run of DreadPoor/Satyr-7B-Model_Stock
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/Satyr-7B-Model_Stock](https://huggingface.co/DreadPoor/Satyr-7B-Model_Stock)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__Satyr-7B-Model_Stock\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T03:21:27.554118](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Satyr-7B-Model_Stock/blob/main/results_2024-04-06T03-21-27.554118.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538228091662829,\n\
\ \"acc_stderr\": 0.03204349764412167,\n \"acc_norm\": 0.6545608990736203,\n\
\ \"acc_norm_stderr\": 0.032693748107169206,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6376075140129857,\n\
\ \"mc2_stderr\": 0.015484912688455579\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6612627986348123,\n \"acc_stderr\": 0.01383056892797433,\n\
\ \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726302\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6976697868950408,\n\
\ \"acc_stderr\": 0.004583289072937751,\n \"acc_norm\": 0.8696474805815575,\n\
\ \"acc_norm_stderr\": 0.003360027661765394\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189936,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168592,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168592\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258165,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258165\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070813,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070813\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6376075140129857,\n\
\ \"mc2_stderr\": 0.015484912688455579\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218317\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6565579984836998,\n \
\ \"acc_stderr\": 0.01307993381180031\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/Satyr-7B-Model_Stock
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|arc:challenge|25_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|gsm8k|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hellaswag|10_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T03-21-27.554118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T03-21-27.554118.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- '**/details_harness|winogrande|5_2024-04-06T03-21-27.554118.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T03-21-27.554118.parquet'
- config_name: results
data_files:
- split: 2024_04_06T03_21_27.554118
path:
- results_2024-04-06T03-21-27.554118.parquet
- split: latest
path:
- results_2024-04-06T03-21-27.554118.parquet
---
# Dataset Card for Evaluation run of DreadPoor/Satyr-7B-Model_Stock
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/Satyr-7B-Model_Stock](https://huggingface.co/DreadPoor/Satyr-7B-Model_Stock) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__Satyr-7B-Model_Stock",
"harness_winogrande_5",
split="train")
```
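Each run is stored as a split named after its timestamp, with `latest` aliasing the most recent one. When a details dataset accumulates several runs, you can also resolve the newest split name yourself; a minimal sketch, assuming the split names keep the `YYYY_MM_DDTHH_MM_SS.micro` shape used above (the `latest_split` helper is illustrative, not part of any `datasets` API):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names use '_' where ISO timestamps use ':' and '-',
    e.g. '2024_04_06T03_21_27.554118'.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

# This dataset currently contains a single run:
print(latest_split(["2024_04_06T03_21_27.554118"]))
```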
## Latest results
These are the [latest results from run 2024-04-06T03:21:27.554118](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Satyr-7B-Model_Stock/blob/main/results_2024-04-06T03-21-27.554118.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6538228091662829,
"acc_stderr": 0.03204349764412167,
"acc_norm": 0.6545608990736203,
"acc_norm_stderr": 0.032693748107169206,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6376075140129857,
"mc2_stderr": 0.015484912688455579
},
"harness|arc:challenge|25": {
"acc": 0.6612627986348123,
"acc_stderr": 0.01383056892797433,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.013562691224726302
},
"harness|hellaswag|10": {
"acc": 0.6976697868950408,
"acc_stderr": 0.004583289072937751,
"acc_norm": 0.8696474805815575,
"acc_norm_stderr": 0.003360027661765394
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189936,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168592,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168592
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.0127569333828237,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.0127569333828237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070813,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070813
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.6376075140129857,
"mc2_stderr": 0.015484912688455579
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218317
},
"harness|gsm8k|5": {
"acc": 0.6565579984836998,
"acc_stderr": 0.01307993381180031
}
}
```
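The JSON above is a plain task-name → metrics mapping, so it is straightforward to post-process. For instance, a sketch that extracts the MMLU (hendrycksTest) subtask accuracies and ranks them; the small `results` literal below just excerpts three entries from the results listed above:

```python
# Excerpt of the per-task results shown above (three entries for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761},
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
}

# Strip the "harness|hendrycksTest-" prefix and the "|5" few-shot suffix.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
}

for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
# prints marketing (0.876), then virology (0.530), then abstract_algebra (0.340)
```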
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.3 | ---
pretty_name: Evaluation run of MiniMoog/Mergerix-7b-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MiniMoog/Mergerix-7b-v0.3](https://huggingface.co/MiniMoog/Mergerix-7b-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T02:56:59.804461](https://huggingface.co/datasets/open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.3/blob/main/results_2024-04-03T02-56-59.804461.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511058828920021,\n\
\ \"acc_stderr\": 0.032038568937180344,\n \"acc_norm\": 0.6499991792728106,\n\
\ \"acc_norm_stderr\": 0.03271465925305467,\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7800529359705557,\n\
\ \"mc2_stderr\": 0.013691172247985002\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428175,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7162915753833897,\n\
\ \"acc_stderr\": 0.004498757194493397,\n \"acc_norm\": 0.8913563035251942,\n\
\ \"acc_norm_stderr\": 0.003105556631739391\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047703,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863935,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863935\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806318,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7800529359705557,\n\
\ \"mc2_stderr\": 0.013691172247985002\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.01249392734865963\n }\n}\n```"
repo_url: https://huggingface.co/MiniMoog/Mergerix-7b-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-56-59.804461.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-56-59.804461.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- '**/details_harness|winogrande|5_2024-04-03T02-56-59.804461.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T02-56-59.804461.parquet'
- config_name: results
data_files:
- split: 2024_04_03T02_56_59.804461
path:
- results_2024-04-03T02-56-59.804461.parquet
- split: latest
path:
- results_2024-04-03T02-56-59.804461.parquet
---
# Dataset Card for Evaluation run of MiniMoog/Mergerix-7b-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MiniMoog/Mergerix-7b-v0.3](https://huggingface.co/MiniMoog/Mergerix-7b-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-03T02:56:59.804461](https://huggingface.co/datasets/open-llm-leaderboard/details_MiniMoog__Mergerix-7b-v0.3/blob/main/results_2024-04-03T02-56-59.804461.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6511058828920021,
"acc_stderr": 0.032038568937180344,
"acc_norm": 0.6499991792728106,
"acc_norm_stderr": 0.03271465925305467,
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7800529359705557,
"mc2_stderr": 0.013691172247985002
},
"harness|arc:challenge|25": {
"acc": 0.71160409556314,
"acc_stderr": 0.013238394422428175,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7162915753833897,
"acc_stderr": 0.004498757194493397,
"acc_norm": 0.8913563035251942,
"acc_norm_stderr": 0.003105556631739391
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047703,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863935,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806318,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7800529359705557,
"mc2_stderr": 0.013691172247985002
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
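A results dict like the one above needs no extra tooling to inspect. The sketch below (plain Python, operating on a hand-copied subset of the JSON above rather than the full file) extracts each task's `acc_norm` and ranks the tasks best-first:

```python
# Minimal sketch: extract and rank per-task accuracies from a results dict.
# `results` is a hand-copied subset of the JSON shown above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7286689419795221},
    "harness|hellaswag|10": {"acc_norm": 0.8913563035251942},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8846153846153846},
}

# Keep only entries that report acc_norm, then sort best-first.
ranked = sorted(
    ((task, metrics["acc_norm"]) for task, metrics in results.items()
     if "acc_norm" in metrics),
    key=lambda pair: pair[1],
    reverse=True,
)

for task, score in ranked:
    print(f"{score:.4f}  {task}")
```

The same pattern works on the full JSON once downloaded (tasks such as `harness|winogrande|5` report only `acc`, which is why the filter is needed).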
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
Orange/WikiFactDiff
---
license: cc-by-sa-4.0
language:
- en
tags:
- Factual knowledge update
- General knowledge
- Wikidata
task_categories:
- other
size_categories:
- 100K<n<1M
configs:
- config_name: 20210104-20230227
default: true
data_files:
- split: train
path: "20210104-20230227/*.parquet"
- config_name: triple_verbs
data_files:
- split: train
path: "triple_verbs/*.parquet"
---
# WikiFactDiff: A Realistic Dataset for Atomic Factual Knowledge Update
WikiFactDiff is a dataset designed as a resource to perform realistic factual updates within language models and to evaluate them post-update.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
WikiFactDiff is a dataset that describes the factual changes between two dates as a collection of simple facts divided into three categories: **new**, **obsolete**, and **static**. The facts are represented by subject-relation-object triples. WikiFactDiff is constructed by comparing the state of the Wikidata knowledge base at two dates $T_{old}$ and $T_{new}$.
Those facts are accompanied by verbalization templates and cloze tests that enable running update algorithms and evaluating them.
Contrary to other datasets, such as zsRE and CounterFact, WikiFactDiff constitutes a realistic update setting that involves various update scenarios, including replacements, archival, and new entity insertions.
WikiFactDiff sample (triples only) | Templates used for verbalization
:-------------------------:|:-------------------------:
[<img src="readme_images/sample.png" width="500"/>](./images/sample.png) | [<img src="readme_images/verb.png" width="500"/>](./images/verb.png)
We are releasing here the WikiFactDiff dataset for January 4, 2021 and February 27, 2023, which is ideal for updating language models trained using the Pile dataset released on December 31, 2020.
**Note:** Future releases, to fit other models for instance, will be stored here as different configurations of WikiFactDiff.
### Dataset Features
- **Language(s) (NLP):** English
- **License:** This work is licensed via CC BY-SA 4.0
### External resources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [GitHub](https://github.com/Orange-OpenSource/WikiFactDiff) (To possibly rebuild the dataset with different $T_{old}$ and $T_{new}$)
- **Paper:** [Link](https://arxiv.org/abs/2403.14364)
## Uses
<!-- This section describes suitable use cases for the dataset. -->
- Align language models with current factual knowledge
- Evaluate knowledge update algorithms on realistic updates:
- *Replacement-only* algorithms, e.g., ROME, MEMIT, MEND, etc.
- General algorithms that can handle any update that can arise from the semantic triple representation of facts *(s,r,o)*.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
WikiFactDiff contains a list of updates. Here are the fields of each element of the list:
- **"subject"** (dict)
- **"id"** : Subject Wikidata ID (string)
- **"label"** : Subject Wikidata label (string)
- **"description"** : Subject Wikidata description (string)
- **"subject_is_ph_new"** : The subject is a new entity, i.e. an entity that did not exist at $T_{old}$ but exists at $T_{new}$. (bool)
- **"subject_popularity"** : A measure of the subject's popularity. (float)
- **"relation"** (dict)
- **"id"** : Relation Wikidata ID (string)
- **"label"** : Relation Wikidata label (string)
- **"description"** : Relation Wikidata description (string)
- **"relation_is_temp_func"** : The relation is temporal functional (bool)
- **"is_replace"** : The update represents a replacement. For instance, replacing the prime minister of the UK. (bool)
- **"objects"** (list): each *dict* in the list contains the fields:
- **"id"** : Object Wikidata ID or None if it's a literal (string)
- **"label"** : Object Wikidata label (string)
- **"description"** : Object Wikidata description (string)
- **"decision"** : It can take three values (*new, obsolete, static*) depending on the veracity of the object. For example, in (Donald Trump, head of state, USA), USA receives the label *obsolete* (suppose $T_{old}=2022$ and $T_{new}=2024$, for instance). (string)
- **"update_prompt"** (string): The cloze test that is fed, together with the model, to the update algorithm to perform the update.
- **"generalization_prompts"** : The cloze tests used to evaluate the generalization of the update to paraphrases.
- **"neighborhood"** (list): The list of neighbor groups (facts) to assess potential bleedover. The neighborhood's relation is the same as the one in the update. Each *dict* in the list contains the fields:
- **"subject"** (dict):
- **"id"** : Neighbor subject Wikidata ID (string)
- **"label"** : Neighbor subject Wikidata label (string)
- **"description"** : Neighbor subject Wikidata description (string)
- **"dist"** : Distance between the two entities : *neighborhood.subject* and the current *subject*. (float)
- **"objects"** (list): each *dict* in the list contains the fields:
- **"id"** : Object Wikidata ID or None if it's a literal (string)
- **"label"** : Object Wikidata label (string)
- **"description"** : Object Wikidata description (string)
- **"prompt"**: The cloze test used to validate the knowledge of this neighbor triple by the LM. For instance, "The head of state of France is ____". (string)
A more detailed description of the concepts above is included in our paper, including the measure of an entity's popularity, the method to construct the neighborhood of a fact, and the meaning of temporal functional relations.
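To give a concrete feel for the structure, here is a minimal sketch (plain Python, operating on a hand-written record that mimics the fields above — the values are illustrative, not taken from the dataset) that partitions a record's objects by their *decision* label:

```python
# Minimal sketch: group a WikiFactDiff-style record's objects by decision label.
# The record below is hand-written to mimic the schema; values are illustrative.
record = {
    "subject": {"id": "Q145", "label": "United Kingdom"},
    "relation": {"id": "P6", "label": "head of government"},
    "objects": [
        {"label": "Boris Johnson", "decision": "obsolete"},
        {"label": "Rishi Sunak", "decision": "new"},
    ],
}

def partition_by_decision(record):
    """Return {"new": [...], "obsolete": [...], "static": [...]} object labels."""
    groups = {"new": [], "obsolete": [], "static": []}
    for obj in record["objects"]:
        groups[obj["decision"]].append(obj["label"])
    return groups

groups = partition_by_decision(record)
print(groups["new"])       # labels to insert into the model
print(groups["obsolete"])  # labels the model should no longer assert
```

An update algorithm would typically consume the *new* objects together with `update_prompt`, while *obsolete* and *static* objects drive the post-update evaluation.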
## Dataset Creation
### Source Data
- The facts in triple format were collected from Wikidata.
- The templates to verbalize these triples in English were created using post-processed ChatGPT verbalizations.
#### Data Collection and Processing
1. Two instances of Wikidata are collected at $T_{old}$ and $T_{new}$ respectively.
2. These instances are preprocessed to filter irrelevant data and compared to get the difference between them.
3. Each relevant triple in this difference is labeled with *new, static* or *obsolete*.
4. These triples are verbalized, and a set of neighbor facts is collected for each triple.
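The labeling in step 3 can be sketched with plain set operations (an illustrative simplification — the real pipeline also handles filtering, temporal-functional relations, and relevance criteria described in the paper):

```python
# Illustrative simplification of step 3: label triples by comparing two snapshots.
# Each snapshot is a set of (subject, relation, object) triples; the toy
# contents below are made up for the example.
old_kb = {
    ("UK", "head of government", "Boris Johnson"),
    ("France", "capital", "Paris"),
}
new_kb = {
    ("UK", "head of government", "Rishi Sunak"),
    ("France", "capital", "Paris"),
}

new_facts = new_kb - old_kb        # present only at T_new  -> "new"
obsolete_facts = old_kb - new_kb   # present only at T_old  -> "obsolete"
static_facts = old_kb & new_kb     # present at both dates  -> "static"

print(sorted(new_facts))
print(sorted(obsolete_facts))
print(sorted(static_facts))
```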
<center><br><b>Build process</b></br><img src="readme_images/build_process.png" width="350"/></center>
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@misc{khodja2024wikifactdiff,
title={WikiFactDiff: A Large, Realistic, and Temporally Adaptable Dataset for Atomic Factual Knowledge Update in Causal Language Models},
author={Hichem Ammar Khodja and Frédéric Béchet and Quentin Brabant and Alexis Nasr and Gwénolé Lecorvé},
year={2024},
eprint={2403.14364},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
**APA:**
Khodja, H.A., Béchet, F., Brabant, Q., Nasr, A., & Lecorvé, G. (2024). WikiFactDiff: A Large, Realistic, and Temporally Adaptable Dataset for Atomic Factual Knowledge Update in Causal Language Models.
Alexator26/1839_with_messy_bg
---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 1301405031.25
num_examples: 1839
download_size: 1301437833
dataset_size: 1301405031.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
jahjinx/IMDb_movie_reviews
---
pretty_name: IMDb
task_categories:
- text-classification
task_ids:
- sentiment-classification
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
---
# Dataset Card for IMDb Movie Reviews
## Dataset Description
- **Homepage:** [http://ai.stanford.edu/~amaas/data/sentiment/](http://ai.stanford.edu/~amaas/data/sentiment/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Total amount of disk used:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
This is a custom train/test/validation split of the IMDb Large Movie Review Dataset available from [http://ai.stanford.edu/~amaas/data/sentiment/](http://ai.stanford.edu/~amaas/data/sentiment/).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
#### IMDb_movie_reviews
An example of 'train':
```
{
    "text": "Beautifully photographed and ably acted, generally, but the writing is very slipshod. There are scenes of such unbelievability that there is no joy in the watching. The fact that the young lover has a twin brother, for instance, is so contrived that I groaned out loud. And the \"emotion-light bulb connection\" seems gimmicky, too.<br /><br />I don't know, though. If you have a few glasses of wine and feel like relaxing with something pretty to look at with a few flaccid comedic scenes, this is a pretty good movie. No major effort on the part of the viewer required. But Italian film, especially Italian comedy, is usually much, much better than this.",
    "label": 0
}
```
### Data Fields
The data fields are the same among all splits.
#### IMDb_movie_reviews
- `text`: a `string` feature.
- `label`: a classification label, with values `neg` (0), `pos` (1).
### Data Splits
| name | train | validation | test |
|------------------|------:|-----------:|------:|
|IMDb_movie_reviews| 36000 | 4000 | 10000 |
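The `label` field is an integer class id, and a plain lookup maps it back to the names listed under Data Fields. A minimal sketch (the example record mirrors the 'train' sample shown above; note that the custom split sizes sum to the 50,000 labeled reviews of the original IMDb release):

```python
# Label scheme from this card: neg -> 0, pos -> 1.
id2label = {0: "neg", 1: "pos"}
label2id = {name: idx for idx, name in id2label.items()}

# A record shaped like the 'train' example shown above.
example = {"text": "Beautifully photographed and ably acted...", "label": 0}
print(id2label[example["label"]])  # -> neg

# The custom split sizes sum to the 50,000 labeled reviews
# of the original IMDb release.
splits = {"train": 36000, "validation": 4000, "test": 10000}
print(sum(splits.values()))  # -> 50000
```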
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@InProceedings{maas-EtAl:2011:ACL-HLT2011,
author = {Maas, Andrew L. and Daly, Raymond E. and Pham, Peter T. and Huang, Dan and Ng, Andrew Y. and Potts, Christopher},
title = {Learning Word Vectors for Sentiment Analysis},
booktitle = {Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies},
month = {June},
year = {2011},
address = {Portland, Oregon, USA},
publisher = {Association for Computational Linguistics},
pages = {142--150},
url = {http://www.aclweb.org/anthology/P11-1015}
}
```
### Contributions
[More Information Needed] |
munawwarsultan2017/US_Presidential_Election_2020_Dem_Rep | ---
license: mit
---
|
Akatsuki-Amemiya/Akatsuki_Cantonese_Singing | ---
license: other
language:
- zh
tags:
- music
size_categories:
- 100B<n<1T
---
## Akatsuki 的粤语歌声数据集 (Akatsuki's Cantonese Singing Dataset) ##
----
使用前请查看[License](https://huggingface.co/datasets/Akatsuki-Amemiya/Akatsuki_Cantonese_Singing#license)
进行申请后请发送邮件到1262917464@qq.com,以便人工审核通过。
我知道申请时HF会给我发送邮箱,但是我会忽视掉它
After submitting the application, please send an email to 1262917464@qq.com for manual review and approval.
Note that the automated access-request notifications sent by HF will be ignored; only a direct email counts.
申請を行った後、1262917464@qq.comにメールを送信して、手動で審査と承認を行ってください。
HFからのメールのみ無視されます。
----
### License ###
----
#### 中文 ####
该数据集在使用前,需严格遵守以下条款。若您不同意这些条款,请勿使用该数据集。
1.权利授权
本数据集拥有者(以下简称“作者”)授予您非排他性、不可转让、不可分许可使用本数据集,以及使用本数据集产生的所有成果,包括商业和非商业目的。
但是,无论是否为商业用途,您必须注明数据集来源及作者,以允许其他人获得使用权限。
2.共享回报
所有使用该数据集产生的公开成果(包括发表的论文、研究报告、软件、算法等),必须无偿为该数据集作者共享完整本地实际操作流程,以便数据集作者可以在本地实际复现公开成果。
3.商业使用
如您打算使用该数据集进行商业活动,您必须提前告知数据集作者,并获得数据集作者的书面同意。商业使用包括但不限于出售数据集或使用数据集进行产品研发等。
4.使用限制
禁止从数据集猜测出数据集提供者中之人现实身份,也不允许使用该数据集产出任何宣传任何政治意识形态的作品。如有违反,数据集作者有权采取法律措施。
5.免责声明
该数据集是在其提供的现状(“AS IS”)下提供的,作者不对该数据集及使用该数据集产生的成果的质量、适用性和可靠性做出任何明示或暗示的保证。
----
#### English ####
This translation is provided by ChatGPT. In case of any discrepancy with the Chinese version, the Chinese version shall prevail.
Before using this dataset, you must strictly abide by the following terms. If you do not agree to these terms, do not use this dataset.
1. Rights Authorization
The owner of this dataset (hereinafter referred to as "the author") grants you a non-exclusive, non-transferable, and non-divisible license to use this dataset and all results generated by using this dataset for commercial and non-commercial purposes.
However, regardless of whether it is a commercial use, you must indicate the source and author of the dataset, to allow others to obtain usage rights.
2. Sharing Returns
All public results generated by using this dataset (including published papers, research reports, software, algorithms, etc.) must be fully shared with the dataset author at no charge, so that the dataset author can reproduce public results locally.
3. Commercial Use
If you intend to use this dataset for commercial activities, you must inform the dataset author in advance and obtain the written consent of the dataset author. Commercial use includes but is not limited to selling the dataset or using the dataset for product development.
4. Usage Restrictions
Guessing the real identity of the data providers from the dataset is prohibited, and it is also not allowed to produce any works promoting any political ideology using this dataset. If there is any violation, the dataset author has the right to take legal measures.
5. Disclaimer
This dataset is provided as-is, and the author makes no express or implied warranties as to the quality, applicability, and reliability of this dataset and the results generated by using this dataset.
----
#### 日本語 ####
この翻訳はChatGPTによって提供されたものであり、中国語版と相違がある場合は中国語版が優先されます。
このデータセットを使用する前に、以下の条件に厳密に従う必要があります。これらの条件に同意しない場合は、このデータセットを使用しないでください。
1. 権利の承認
このデータセットの所有者(以下、「著者」とします)は、商業および非商業目的を含む、このデータセットとこのデータセットを使用して生成されたすべての成果に対して、排他的で譲渡不可および不可分割なライセンスをあなたに付与します。
ただし、商業利用であっても、データセットの出典と著者を示す必要があり、他の人が使用権を取得できるようにする必要があります。
2. 分かち合いのリターン
このデータセットを使用して生成されたすべての公開成果物(出版された論文、研究報告、ソフトウェア、アルゴリズムなど)は、データセットの著者に対して無償で完全共有する必要がありますので、データセットの著者は地元で公開成果物を再現できます。
3. 商業利用
このデータセットを商業活動に使用する場合は、事前にデータセットの著者に通知し、データセットの著者の書面による同意を得る必要があります。商業利用には、データセットの販売や製品開発に使用することなどが含まれます。
4. 使用制限
データセットからデータ提供者の実際の身元を推測することは禁止されており、このデータセットを使用して、いかなる政治的イデオロギーを促進する作品を製作することもできません。違反した場合、データセットの著者は法的手段を取る権利があります。
5. 免責事項
データセットは「現状有姿」で提供されるものであり、作者は、このデータセットおよびこのデータセットを使用して生成された成果物の品質、適用性、信頼性について、明示的または黙示的な保証を提供しません。 |
Seanxh/twitter_dataset_1713091512 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24185
num_examples: 59
download_size: 13915
dataset_size: 24185
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xjs521/instruct_llm | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 104457592
num_examples: 26000
download_size: 55429667
dataset_size: 104457592
---
|
JCTN/ReActor | ---
license: mit
viewer: false
---
ReActor Assets
=================
The Fast and Simple Face Swap Extension
[sd-webui-reactor](https://github.com/Gourieff/sd-webui-reactor) <br>
[comfyui-reactor-node](https://github.com/Gourieff/comfyui-reactor-node) <br>
Original asset repository: [Gourieff/ReActor](https://huggingface.co/datasets/Gourieff/ReActor)
Models
------
| file | source | license |
|---------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------|-------------------------------------------------------------------------|
| [buffalo_l.zip](https://huggingface.co/datasets/Gourieff/ReActor/blob/main/models/buffalo_l.zip) | [DeepInsight](https://github.com/deepinsight/insightface) |  |
| [codeformer-v0.1.0.pth](https://huggingface.co/datasets/Gourieff/ReActor/blob/main/models/facerestore_models/codeformer-v0.1.0.pth) | [sczhou](https://github.com/sczhou/CodeFormer) |  |
| [GFPGANv1.3.pth](https://huggingface.co/datasets/Gourieff/ReActor/blob/main/models/facerestore_models/GFPGANv1.3.pth) | [TencentARC](https://github.com/TencentARC/GFPGAN) |  |
| [GFPGANv1.4.pth](https://huggingface.co/datasets/Gourieff/ReActor/blob/main/models/facerestore_models/GFPGANv1.4.pth) | [TencentARC](https://github.com/TencentARC/GFPGAN) |  |
| [inswapper_128.onnx](https://github.com/facefusion/facefusion-assets/releases/download/models/inswapper_128.onnx) | [DeepInsight](https://github.com/deepinsight/insightface) |  |
| [inswapper_128_fp16.onnx](https://github.com/facefusion/facefusion-assets/releases/download/models/inswapper_128_fp16.onnx) | [Hillobar](https://github.com/Hillobar/Rope) |  |
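The table links point either at files hosted in the Gourieff/ReActor dataset repo or at GitHub release assets. For the Hub-hosted files, the direct-download URL follows the standard `resolve` pattern; a stdlib-only sketch (the filename below is just one example from the table):

```python
def reactor_asset_url(filename: str, repo: str = "Gourieff/ReActor", revision: str = "main") -> str:
    """Build the direct-download URL for a file in a Hugging Face dataset repo."""
    return f"https://huggingface.co/datasets/{repo}/resolve/{revision}/{filename}"

url = reactor_asset_url("models/facerestore_models/GFPGANv1.4.pth")
print(url)
# -> https://huggingface.co/datasets/Gourieff/ReActor/resolve/main/models/facerestore_models/GFPGANv1.4.pth
```

In practice, `huggingface_hub.hf_hub_download(repo_id="Gourieff/ReActor", repo_type="dataset", filename=...)` is preferable, since it adds local caching for you.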
|
xaviviro/oasst1_ca_gpt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: validation
num_bytes: 490388
num_examples: 517
- name: train
num_bytes: 9262378
num_examples: 9841
download_size: 4979029
dataset_size: 9752766
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
language:
- ca
--- |
CyberHarem/jeanne_d_arc_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jeanne_d_arc (Granblue Fantasy)
This is the dataset of jeanne_d_arc (Granblue Fantasy), containing 314 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hair_ornament, breasts, blue_eyes, hair_flower, hairband, large_breasts, bangs, hair_intakes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 314 | 432.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 314 | 261.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 759 | 546.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 314 | 391.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 759 | 739.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jeanne_d_arc_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bare_shoulders, flower, looking_at_viewer, solo, white_dress, ahoge, detached_sleeves, blush, thighs, hair_between_eyes, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, solo, white_dress, bare_shoulders, flower, gauntlets, looking_at_viewer, thighhighs, ahoge, flag, greaves, thigh_boots, armored_boots, blush, sword |
| 2 | 14 |  |  |  |  |  | 1girl, solo, thighhighs, medium_breasts, cleavage, gauntlets, holding_sword, bare_shoulders, flag, looking_at_viewer, very_long_hair, lily_(flower), armored_dress, collarbone, thigh_boots |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, very_long_hair, white_dress, ahoge, blush, thigh_boots, thighhighs, detached_sleeves, flower, hair_between_eyes, sitting, smile |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, official_alternate_costume, purple_bikini, solo, collarbone, simple_background, smile, lily_(flower), upper_body, white_background, blush, parted_lips |
| 5 | 13 |  |  |  |  |  | 1girl, cleavage, flower, looking_at_viewer, official_alternate_costume, purple_bikini, solo, bare_shoulders, blush, collarbone, navel, side-tie_bikini_bottom, diadem, simple_background, front-tie_bikini_top, white_background, parted_lips, ponytail, smile, see-through |
| 6 | 5 |  |  |  |  |  | 1girl, armpits, arms_behind_head, arms_up, cleavage, flower, looking_at_viewer, navel, official_alternate_costume, purple_bikini, solo, blush, smile, bare_shoulders, collarbone, diadem, front-tie_bikini_top, mouth_hold, purple_eyes, side-tie_bikini_bottom |
| 7 | 24 |  |  |  |  |  | 1girl, cleavage, day, official_alternate_costume, purple_bikini, flower, looking_at_viewer, outdoors, solo, blush, navel, ocean, beach, collarbone, bare_shoulders, blue_sky, side-tie_bikini_bottom, cloud, smile, front-tie_bikini_top, hair_between_eyes, armpits, diadem |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | flower | looking_at_viewer | solo | white_dress | ahoge | detached_sleeves | blush | thighs | hair_between_eyes | white_background | gauntlets | thighhighs | flag | greaves | thigh_boots | armored_boots | sword | medium_breasts | cleavage | holding_sword | very_long_hair | lily_(flower) | armored_dress | collarbone | sitting | smile | official_alternate_costume | purple_bikini | simple_background | upper_body | parted_lips | navel | side-tie_bikini_bottom | diadem | front-tie_bikini_top | ponytail | see-through | armpits | arms_behind_head | arms_up | mouth_hold | purple_eyes | day | outdoors | ocean | beach | blue_sky | cloud |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------|:--------------------|:-------|:--------------|:--------|:-------------------|:--------|:---------|:--------------------|:-------------------|:------------|:-------------|:-------|:----------|:--------------|:----------------|:--------|:-----------------|:-----------|:----------------|:-----------------|:----------------|:----------------|:-------------|:----------|:--------|:-----------------------------|:----------------|:--------------------|:-------------|:--------------|:--------|:-------------------------|:---------|:-----------------------|:-----------|:--------------|:----------|:-------------------|:----------|:-------------|:--------------|:------|:-----------|:--------|:--------|:-----------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | | X | X | | | | | | | | X | X | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | | X | | | X | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | X | | | | X | | | X | | | | | | | | | X | | | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | X | X | X | X | | | | X | | | X | | | | | | | | | X | | | | | X | | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | | | | | | | | | X | | | | | X | | X | X | X | | | | X | X | X | X | | | X | X | X | X | X | | | | | | |
| 7 | 24 |  |  |  |  |  | X | X | X | X | X | | | | X | | X | | | | | | | | | | X | | | | | X | | X | X | X | | | | X | X | X | X | | | X | | | | | X | X | X | X | X | X |
|
Nexusflow/VirusTotalBenchmark | ---
dataset_info:
features:
- name: Input
dtype: string
- name: Output
dtype: string
splits:
- name: train
num_bytes: 36883
num_examples: 151
download_size: 15657
dataset_size: 36883
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-world_religions-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 24737
num_examples: 171
download_size: 18201
dataset_size: 24737
---
# Dataset Card for "mmlu-world_religions-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/TinyImagenet_2k_validation | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': goldfish
'1': fire salamander
'2': American bullfrog
'3': tailed frog
'4': American alligator
'5': boa constrictor
'6': trilobite
'7': scorpion
'8': southern black widow
'9': tarantula
'10': centipede
'11': koala
'12': jellyfish
'13': brain coral
'14': snail
'15': sea slug
'16': American lobster
'17': spiny lobster
'18': black stork
'19': king penguin
'20': albatross
'21': dugong
'22': Yorkshire Terrier
'23': Golden Retriever
'24': Labrador Retriever
'25': German Shepherd Dog
'26': Standard Poodle
'27': tabby cat
'28': Persian cat
'29': Egyptian Mau
'30': cougar
'31': lion
'32': brown bear
'33': ladybug
'34': grasshopper
'35': stick insect
'36': cockroach
'37': praying mantis
'38': dragonfly
'39': monarch butterfly
'40': sulphur butterfly
'41': sea cucumber
'42': guinea pig
'43': pig
'44': ox
'45': bison
'46': bighorn sheep
'47': gazelle
'48': arabian camel
'49': orangutan
'50': chimpanzee
'51': baboon
'52': African bush elephant
'53': red panda
'54': abacus
'55': academic gown
'56': altar
'57': backpack
'58': baluster / handrail
'59': barbershop
'60': barn
'61': barrel
'62': basketball
'63': bathtub
'64': station wagon
'65': lighthouse
'66': beaker
'67': beer bottle
'68': bikini
'69': binoculars
'70': birdhouse
'71': bow tie
'72': brass memorial plaque
'73': bucket
'74': high-speed train
'75': butcher shop
'76': candle
'77': cannon
'78': cardigan
'79': automated teller machine
'80': CD player
'81': storage chest
'82': Christmas stocking
'83': cliff dwelling
'84': computer keyboard
'85': candy store
'86': convertible
'87': crane bird
'88': dam
'89': desk
'90': dining table
'91': dumbbell
'92': flagpole
'93': fly
'94': fountain
'95': freight car
'96': frying pan
'97': fur coat
'98': gas mask or respirator
'99': go-kart
'100': gondola
'101': hourglass
'102': iPod
'103': rickshaw
'104': kimono
'105': lampshade
'106': lawn mower
'107': lifeboat
'108': limousine
'109': magnetic compass
'110': maypole
'111': military uniform
'112': miniskirt
'113': moving van
'114': neck brace
'115': obelisk
'116': oboe
'117': pipe organ
'118': parking meter
'119': payphone
'120': picket fence
'121': pill bottle
'122': plunger
'123': police van
'124': poncho
'125': soda bottle
'126': potter's wheel
'127': missile
'128': punching bag
'129': refrigerator
'130': remote control
'131': rocking chair
'132': rugby ball
'133': sandal
'134': school bus
'135': scoreboard
'136': sewing machine
'137': snorkel
'138': sock
'139': sombrero
'140': space heater
'141': spider web
'142': sports car
'143': through arch bridge
'144': stopwatch
'145': sunglasses
'146': suspension bridge
'147': swim trunks / shorts
'148': syringe
'149': teapot
'150': teddy bear
'151': thatched roof
'152': torch
'153': tractor
'154': triumphal arch
'155': trolleybus
'156': turnstile
'157': umbrella
'158': vestment
'159': viaduct
'160': volleyball
'161': water jug
'162': water tower
'163': wok
'164': wooden spoon
'165': comic book
'166': fishing casting reel
'167': guacamole
'168': ice cream
'169': popsicle
'170': goose
'171': drumstick
'172': plate
'173': pretzel
'174': mashed potatoes
'175': cauliflower
'176': bell pepper
'177': lemon
'178': banana
'179': pomegranate
'180': meatloaf
'181': pizza
'182': pot pie
'183': espresso
'184': bee
'185': apron
'186': pole
'187': Chihuahua
'188': mountain
'189': cliff
'190': coral reef
'191': lakeshore
'192': beach
'193': acorn
'194': broom
'195': mushroom
'196': metal nail
'197': chain
'198': slug
'199': orange
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
- name: id
dtype: int64
splits:
- name: validation
num_bytes: 5104453.0
num_examples: 2000
download_size: 3249857
dataset_size: 5104453.0
---
# Dataset Card for "TinyImagenet_2k_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713120858 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 32629
num_examples: 78
download_size: 16187
dataset_size: 32629
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
avalab/Mforms_TripAdvisor | ---
dataset_info:
features:
- name: utterance
dtype: string
- name: slot_0
dtype: string
- name: semantic_map
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 114805
num_examples: 803
download_size: 0
dataset_size: 114805
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mforms_TripAdvisor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allen0523/robot300 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 240903241.0
num_examples: 300
download_size: 240917130
dataset_size: 240903241.0
---
# Dataset Card for "robot300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kaede_ikeno_sakuratrick | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kaede Ikeno
This is the dataset of Kaede Ikeno, containing 150 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 150 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 348 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 383 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 150 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 150 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 150 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 348 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 348 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 301 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 383 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 383 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
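Sibling CyberHarem packages pair each image with a same-name `.txt` file holding its tags (the IMG+TXT convention); that layout is assumed here for illustration, since this table does not state it explicitly. A stdlib sketch of pairing images with their tag files after extraction, using a tiny in-memory stand-in archive:

```python
import io
import zipfile

# Build a tiny in-memory stand-in archive with the assumed IMG+TXT layout.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("1.png", b"")
    zf.writestr("1.txt", "1girl, solo")
    zf.writestr("2.png", b"")
    zf.writestr("2.txt", "1girl, smile")

# Pair every image with the tag string from its same-name .txt file.
with zipfile.ZipFile(buf) as zf:
    pairs = {
        name: zf.read(name.rsplit(".", 1)[0] + ".txt").decode()
        for name in zf.namelist() if name.endswith(".png")
    }
print(pairs)  # {'1.png': '1girl, solo', '2.png': '1girl, smile'}
```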
|
gauravkaul/RedCaps | ---
license: cc-by-4.0
---
|
jxie/trivia_qa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
sequence: string
splits:
- name: train
num_bytes: 24322980
num_examples: 61888
- name: test
num_bytes: 3213880
num_examples: 7993
download_size: 15962297
dataset_size: 27536860
---
# Dataset Card for "trivia_qa"
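The split metadata in the YAML above is internally consistent: `dataset_size` is the sum of the per-split byte counts. A quick sanity check in plain Python:

```python
# Per-split byte counts from the YAML metadata above.
split_bytes = {"train": 24322980, "test": 3213880}
dataset_size = sum(split_bytes.values())
print(dataset_size)  # -> 27536860, matching the declared dataset_size
```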
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sumuks/scientific_papers | ---
license: mit
---
|
hoangho/dataset | ---
license: mit
---
|
dhuck/faust_code | ---
dataset_info:
features:
- name: _id
dtype: string
- name: repository
dtype: string
- name: name
dtype: string
- name: content
dtype: string
- name: download_url
dtype: string
- name: language
dtype: string
- name: comments
dtype: string
- name: code
dtype: string
splits:
- name: train
num_bytes: 28372010
num_examples: 4222
download_size: 10561733
dataset_size: 28372010
---
# Dataset Card for "faust_code"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
medarc/mmlu_professional_medicine | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: training
num_bytes: 28530
num_examples: 36
- name: test
num_bytes: 224349
num_examples: 272
download_size: 146822
dataset_size: 252879
configs:
- config_name: default
data_files:
- split: training
path: data/training-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16 | ---
pretty_name: Evaluation run of TheBloke/robin-65b-v2-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/robin-65b-v2-fp16](https://huggingface.co/TheBloke/robin-65b-v2-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T10:30:00.008059](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16/blob/main/results_2023-10-23T10-30-00.008059.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n\
\ \"em_stderr\": 0.00048005108166193297,\n \"f1\": 0.064190436241611,\n\
\ \"f1_stderr\": 0.001385342539630455,\n \"acc\": 0.5374763713870437,\n\
\ \"acc_stderr\": 0.011680771136203586\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002202181208053691,\n \"em_stderr\": 0.00048005108166193297,\n\
\ \"f1\": 0.064190436241611,\n \"f1_stderr\": 0.001385342539630455\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2699014404852161,\n \
\ \"acc_stderr\": 0.012227442856468897\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938275\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/robin-65b-v2-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T10_30_00.008059
path:
- '**/details_harness|drop|3_2023-10-23T10-30-00.008059.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T10-30-00.008059.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T10_30_00.008059
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-30-00.008059.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-30-00.008059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T10_30_00.008059
path:
- '**/details_harness|winogrande|5_2023-10-23T10-30-00.008059.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T10-30-00.008059.parquet'
- config_name: results
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- results_2023-08-17T22:09:59.169977.parquet
- split: 2023_10_23T10_30_00.008059
path:
- results_2023-10-23T10-30-00.008059.parquet
- split: latest
path:
- results_2023-10-23T10-30-00.008059.parquet
---
# Dataset Card for Evaluation run of TheBloke/robin-65b-v2-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/robin-65b-v2-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/robin-65b-v2-fp16](https://huggingface.co/TheBloke/robin-65b-v2-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16",
"harness_winogrande_5",
	split="latest")
```
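Since every configuration carries one split per timestamped run plus a `latest` alias, you may sometimes want to resolve the most recent run yourself (for example, to compare it against an older run). A minimal sketch, assuming the split-naming convention shown in the YAML above (`latest_run_split` is a hypothetical helper, not part of the leaderboard tooling):

```python
def latest_run_split(split_names):
    """Return the most recent timestamp-named split, ignoring the 'latest' alias.

    Split names such as '2023_10_23T10_30_00.008059' are zero-padded
    year-first timestamps, so they sort chronologically as plain strings
    and max() is sufficient.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)


splits = [
    "2023_08_17T22_09_59.169977",
    "2023_10_23T10_30_00.008059",
    "latest",
]
print(latest_run_split(splits))  # → 2023_10_23T10_30_00.008059

# To load that run's details with the `datasets` library (network access required):
# from datasets import load_dataset
# results = load_dataset(
#     "open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16",
#     "results",
#     split=latest_run_split(splits),
# )
```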
## Latest results
These are the [latest results from run 2023-10-23T10:30:00.008059](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16/blob/main/results_2023-10-23T10-30-00.008059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the per-task configurations, in the run-specific and "latest" splits):
```python
{
"all": {
"em": 0.002202181208053691,
"em_stderr": 0.00048005108166193297,
"f1": 0.064190436241611,
"f1_stderr": 0.001385342539630455,
"acc": 0.5374763713870437,
"acc_stderr": 0.011680771136203586
},
"harness|drop|3": {
"em": 0.002202181208053691,
"em_stderr": 0.00048005108166193297,
"f1": 0.064190436241611,
"f1_stderr": 0.001385342539630455
},
"harness|gsm8k|5": {
"acc": 0.2699014404852161,
"acc_stderr": 0.012227442856468897
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938275
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sidmanale643/med_gennie | ---
license: other
---
|
reciprocate/tinygsm_mixtral_12M | ---
dataset_info:
features:
- name: question
dtype: string
- name: program
dtype: string
- name: result
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 16089878144
num_examples: 12000000
download_size: 4759852649
dataset_size: 16089878144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ummagumm-a/cup-it-ds-classification-pairwise | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
struct:
- name: score
dtype: int64
- name: text
dtype: string
- name: rejected
struct:
- name: score
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 341356800
num_examples: 281940
download_size: 196778839
dataset_size: 341356800
---
# Dataset Card for "cup-it-ds-classification-pairwise"
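The schema above pairs a `prompt` with `chosen` and `rejected` structs, each holding a `score` and a `text`. A minimal sketch of how such a record might be consumed for pairwise ranking (the record below is invented for illustration and is not taken from the dataset):

```python
# A hypothetical record matching the schema above (prompt, chosen, rejected).
record = {
    "prompt": "What is the capital of France?",
    "chosen": {"score": 3, "text": "Paris."},
    "rejected": {"score": 1, "text": "London."},
}

# Pairwise training assumes the chosen response outranks the rejected one;
# a simple preference label can be derived by comparing the two scores.
label = 1 if record["chosen"]["score"] > record["rejected"]["score"] else 0
```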
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hereral/Clara-Training-Data | ---
license: apache-2.0
---
|
yuvalkirstain/task_prediction_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: path
dtype: string
- name: text
dtype: string
- name: task_name
dtype: string
splits:
- name: train
num_bytes: 659890949
num_examples: 5663600
- name: validation
num_bytes: 7823929
num_examples: 60002
download_size: 0
dataset_size: 667714878
---
# Dataset Card for "task_prediction_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xwjiang2010/pile_dedupe_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 88191974579
num_examples: 15000000
download_size: 20794320583
dataset_size: 88191974579
---
# Dataset Card for "pile_dedupe_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
debosneed/manuscript-captions | ---
license: afl-3.0
---
|
yzhuang/autotree_automl_Higgs_gosdt_l512_d3_sd1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 12501600000
num_examples: 100000
- name: validation
num_bytes: 1250160000
num_examples: 10000
download_size: 9801806108
dataset_size: 13751760000
---
# Dataset Card for "autotree_automl_Higgs_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indic_glue | ---
annotations_creators:
- other
language_creators:
- found
language:
- as
- bn
- en
- gu
- hi
- kn
- ml
- mr
- or
- pa
- ta
- te
license:
- other
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|other
task_categories:
- text-classification
- token-classification
- multiple-choice
task_ids:
- topic-classification
- natural-language-inference
- sentiment-analysis
- semantic-similarity-scoring
- named-entity-recognition
- multiple-choice-qa
pretty_name: IndicGLUE
tags:
- discourse-mode-classification
- paraphrase-identification
- cross-lingual-similarity
- headline-classification
dataset_info:
- config_name: actsa-sc.te
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': positive
'1': negative
splits:
- name: train
num_bytes: 1370907
num_examples: 4328
- name: validation
num_bytes: 166089
num_examples: 541
- name: test
num_bytes: 168291
num_examples: 541
download_size: 727630
dataset_size: 1705287
- config_name: bbca.hi
features:
- name: label
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 22126205
num_examples: 3467
- name: test
num_bytes: 5501148
num_examples: 866
download_size: 10349015
dataset_size: 27627353
- config_name: copa.en
features:
- name: premise
dtype: string
- name: choice1
dtype: string
- name: choice2
dtype: string
- name: question
dtype: string
- name: label
dtype: int32
splits:
- name: train
num_bytes: 46033
num_examples: 400
- name: validation
num_bytes: 11679
num_examples: 100
- name: test
num_bytes: 55846
num_examples: 500
download_size: 79431
dataset_size: 113558
- config_name: copa.gu
features:
- name: premise
dtype: string
- name: choice1
dtype: string
- name: choice2
dtype: string
- name: question
dtype: string
- name: label
dtype: int32
splits:
- name: train
num_bytes: 92097
num_examples: 362
- name: validation
num_bytes: 23450
num_examples: 88
- name: test
num_bytes: 109997
num_examples: 448
download_size: 107668
dataset_size: 225544
- config_name: copa.hi
features:
- name: premise
dtype: string
- name: choice1
dtype: string
- name: choice2
dtype: string
- name: question
dtype: string
- name: label
dtype: int32
splits:
- name: train
num_bytes: 93376
num_examples: 362
- name: validation
num_bytes: 23559
num_examples: 88
- name: test
num_bytes: 112830
num_examples: 449
download_size: 104233
dataset_size: 229765
- config_name: copa.mr
features:
- name: premise
dtype: string
- name: choice1
dtype: string
- name: choice2
dtype: string
- name: question
dtype: string
- name: label
dtype: int32
splits:
- name: train
num_bytes: 93441
num_examples: 362
- name: validation
num_bytes: 23874
num_examples: 88
- name: test
num_bytes: 112055
num_examples: 449
download_size: 105962
dataset_size: 229370
- config_name: csqa.as
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 3800523
num_examples: 2942
download_size: 1390423
dataset_size: 3800523
- config_name: csqa.bn
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 54671018
num_examples: 38845
download_size: 19648180
dataset_size: 54671018
- config_name: csqa.gu
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 29131607
num_examples: 22861
download_size: 6027825
dataset_size: 29131607
- config_name: csqa.hi
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 40409347
num_examples: 35140
download_size: 14711258
dataset_size: 40409347
- config_name: csqa.kn
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 21199816
num_examples: 13666
download_size: 7669655
dataset_size: 21199816
- config_name: csqa.ml
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 47220836
num_examples: 26537
download_size: 17382215
dataset_size: 47220836
- config_name: csqa.mr
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 13667174
num_examples: 11370
download_size: 5072738
dataset_size: 13667174
- config_name: csqa.or
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 2562365
num_examples: 1975
download_size: 948046
dataset_size: 2562365
- config_name: csqa.pa
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 5806097
num_examples: 5667
download_size: 2194109
dataset_size: 5806097
- config_name: csqa.ta
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 61868481
num_examples: 38590
download_size: 20789467
dataset_size: 61868481
- config_name: csqa.te
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: options
sequence: string
- name: out_of_context_options
sequence: string
splits:
- name: test
num_bytes: 58784997
num_examples: 41338
download_size: 17447618
dataset_size: 58784997
- config_name: cvit-mkb-clsr.en-bn
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 1990957
num_examples: 5522
download_size: 945551
dataset_size: 1990957
- config_name: cvit-mkb-clsr.en-gu
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 2303377
num_examples: 6463
download_size: 1093313
dataset_size: 2303377
- config_name: cvit-mkb-clsr.en-hi
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 1855989
num_examples: 5169
download_size: 890609
dataset_size: 1855989
- config_name: cvit-mkb-clsr.en-ml
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 1990089
num_examples: 4886
download_size: 868956
dataset_size: 1990089
- config_name: cvit-mkb-clsr.en-mr
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 2130601
num_examples: 5760
download_size: 993961
dataset_size: 2130601
- config_name: cvit-mkb-clsr.en-or
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 274873
num_examples: 752
download_size: 134334
dataset_size: 274873
- config_name: cvit-mkb-clsr.en-ta
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 2565178
num_examples: 5637
download_size: 1091653
dataset_size: 2565178
- config_name: cvit-mkb-clsr.en-te
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 1771129
num_examples: 5049
download_size: 840410
dataset_size: 1771129
- config_name: cvit-mkb-clsr.en-ur
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: test
num_bytes: 288430
num_examples: 1006
download_size: 166129
dataset_size: 288430
- config_name: iitp-mr.hi
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 6704905
num_examples: 2480
- name: validation
num_bytes: 822218
num_examples: 310
- name: test
num_bytes: 702373
num_examples: 310
download_size: 3151762
dataset_size: 8229496
- config_name: iitp-pr.hi
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 945589
num_examples: 4182
- name: validation
num_bytes: 120100
num_examples: 523
- name: test
num_bytes: 121910
num_examples: 523
download_size: 509822
dataset_size: 1187599
- config_name: inltkh.gu
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': entertainment
'1': business
'2': tech
'3': sports
'4': state
'5': spirituality
'6': tamil-cinema
'7': positive
'8': negative
'9': neutral
splits:
- name: train
num_bytes: 883063
num_examples: 5269
- name: validation
num_bytes: 111201
num_examples: 659
- name: test
num_bytes: 110757
num_examples: 659
download_size: 515094
dataset_size: 1105021
- config_name: inltkh.ml
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': entertainment
'1': business
'2': tech
'3': sports
'4': state
'5': spirituality
'6': tamil-cinema
'7': positive
'8': negative
'9': neutral
splits:
- name: train
num_bytes: 1108145
num_examples: 5036
- name: validation
num_bytes: 140055
num_examples: 630
- name: test
num_bytes: 138847
num_examples: 630
download_size: 571019
dataset_size: 1387047
- config_name: inltkh.mr
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': entertainment
'1': business
'2': tech
'3': sports
'4': state
'5': spirituality
'6': tamil-cinema
'7': positive
'8': negative
'9': neutral
splits:
- name: train
num_bytes: 1462614
num_examples: 9672
- name: validation
num_bytes: 180306
num_examples: 1210
- name: test
num_bytes: 180558
num_examples: 1210
download_size: 840304
dataset_size: 1823478
- config_name: inltkh.ta
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': entertainment
'1': business
'2': tech
'3': sports
'4': state
'5': spirituality
'6': tamil-cinema
'7': positive
'8': negative
'9': neutral
splits:
- name: train
num_bytes: 2659569
num_examples: 5346
- name: validation
num_bytes: 316083
num_examples: 669
- name: test
num_bytes: 320465
num_examples: 669
download_size: 1271262
dataset_size: 3296117
- config_name: inltkh.te
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': entertainment
'1': business
'2': tech
'3': sports
'4': state
'5': spirituality
'6': tamil-cinema
'7': positive
'8': negative
'9': neutral
splits:
- name: train
num_bytes: 1361667
num_examples: 4328
- name: validation
num_bytes: 170471
num_examples: 541
- name: test
num_bytes: 173149
num_examples: 541
download_size: 726293
dataset_size: 1705287
- config_name: md.hi
features:
- name: sentence
dtype: string
- name: discourse_mode
dtype: string
- name: story_number
dtype: int32
- name: id
dtype: int32
splits:
- name: train
num_bytes: 1672109
num_examples: 7974
- name: validation
num_bytes: 211187
num_examples: 997
- name: test
num_bytes: 210175
num_examples: 997
download_size: 939801
dataset_size: 2093471
- config_name: sna.bn
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': kolkata
'1': state
'2': national
'3': sports
'4': entertainment
'5': international
splits:
- name: train
num_bytes: 46070046
num_examples: 11284
- name: validation
num_bytes: 5648126
num_examples: 1411
- name: test
num_bytes: 5799979
num_examples: 1411
download_size: 21415940
dataset_size: 57518151
- config_name: wiki-ner.as
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 374983
num_examples: 1021
- name: validation
num_bytes: 49312
num_examples: 157
- name: test
num_bytes: 50456
num_examples: 160
download_size: 72919
dataset_size: 474751
- config_name: wiki-ner.bn
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 7502824
num_examples: 20223
- name: validation
num_bytes: 988683
num_examples: 2985
- name: test
num_bytes: 985941
num_examples: 2690
download_size: 1278219
dataset_size: 9477448
- config_name: wiki-ner.gu
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 1571588
num_examples: 2343
- name: validation
num_bytes: 192804
num_examples: 297
- name: test
num_bytes: 197877
num_examples: 255
download_size: 329660
dataset_size: 1962269
- config_name: wiki-ner.hi
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 3762505
num_examples: 9463
- name: validation
num_bytes: 468678
num_examples: 1114
- name: test
num_bytes: 475253
num_examples: 1256
download_size: 948132
dataset_size: 4706436
- config_name: wiki-ner.kn
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 1352027
num_examples: 2679
- name: validation
num_bytes: 179538
num_examples: 412
- name: test
num_bytes: 180791
num_examples: 476
download_size: 421877
dataset_size: 1712356
- config_name: wiki-ner.ml
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 7678887
num_examples: 15620
- name: validation
num_bytes: 969947
num_examples: 2067
- name: test
num_bytes: 991102
num_examples: 2042
download_size: 2390442
dataset_size: 9639936
- config_name: wiki-ner.mr
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 5431489
num_examples: 12151
- name: validation
num_bytes: 701637
num_examples: 1498
- name: test
num_bytes: 655682
num_examples: 1329
download_size: 1410663
dataset_size: 6788808
- config_name: wiki-ner.or
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 493758
num_examples: 1077
- name: validation
num_bytes: 58568
num_examples: 132
- name: test
num_bytes: 62211
num_examples: 153
download_size: 102783
dataset_size: 614537
- config_name: wiki-ner.pa
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 520244
num_examples: 1408
- name: validation
num_bytes: 61170
num_examples: 186
- name: test
num_bytes: 61788
num_examples: 179
download_size: 149727
dataset_size: 643202
- config_name: wiki-ner.ta
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 10117080
num_examples: 20466
- name: validation
num_bytes: 1267188
num_examples: 2586
- name: test
num_bytes: 1321626
num_examples: 2611
download_size: 2819083
dataset_size: 12705894
- config_name: wiki-ner.te
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-LOC
'1': B-ORG
'2': B-PER
'3': I-LOC
'4': I-ORG
'5': I-PER
'6': O
- name: additional_info
sequence:
sequence: string
splits:
- name: train
num_bytes: 3881211
num_examples: 7978
- name: validation
num_bytes: 458509
num_examples: 841
- name: test
num_bytes: 507806
num_examples: 1110
download_size: 1006881
dataset_size: 4847526
- config_name: wnli.en
features:
- name: hypothesis
dtype: string
- name: premise
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_entailment
'1': entailment
'2': None
splits:
- name: train
num_bytes: 104569
num_examples: 635
- name: validation
num_bytes: 11878
num_examples: 71
- name: test
num_bytes: 37297
num_examples: 146
download_size: 57667
dataset_size: 153744
- config_name: wnli.gu
features:
- name: hypothesis
dtype: string
- name: premise
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_entailment
'1': entailment
'2': None
splits:
- name: train
num_bytes: 251554
num_examples: 635
- name: validation
num_bytes: 28175
num_examples: 71
- name: test
num_bytes: 94578
num_examples: 146
download_size: 98032
dataset_size: 374307
- config_name: wnli.hi
features:
- name: hypothesis
dtype: string
- name: premise
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_entailment
'1': entailment
'2': None
splits:
- name: train
num_bytes: 253334
num_examples: 635
- name: validation
num_bytes: 28676
num_examples: 71
- name: test
num_bytes: 90823
num_examples: 146
download_size: 99450
dataset_size: 372833
- config_name: wnli.mr
features:
- name: hypothesis
dtype: string
- name: premise
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_entailment
'1': entailment
'2': None
splits:
- name: train
num_bytes: 256649
num_examples: 635
- name: validation
num_bytes: 29218
num_examples: 71
- name: test
num_bytes: 97128
num_examples: 146
download_size: 103774
dataset_size: 382995
- config_name: wstp.as
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 13581336
num_examples: 5000
- name: validation
num_bytes: 1698968
num_examples: 625
- name: test
num_bytes: 1697650
num_examples: 626
download_size: 6959458
dataset_size: 16977954
- config_name: wstp.bn
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 143340457
num_examples: 47580
- name: validation
num_bytes: 17759236
num_examples: 5947
- name: test
num_bytes: 17633865
num_examples: 5948
download_size: 69145372
dataset_size: 178733558
- config_name: wstp.gu
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 39353464
num_examples: 10004
- name: validation
num_bytes: 4887752
num_examples: 1251
- name: test
num_bytes: 4699158
num_examples: 1251
download_size: 19763249
dataset_size: 48940374
- config_name: wstp.hi
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 158529578
num_examples: 44069
- name: validation
num_bytes: 19371904
num_examples: 5509
- name: test
num_bytes: 19593001
num_examples: 5509
download_size: 77868574
dataset_size: 197494483
- config_name: wstp.kn
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 139950313
num_examples: 35379
- name: validation
num_bytes: 17789782
num_examples: 4422
- name: test
num_bytes: 17897031
num_examples: 4423
download_size: 67719504
dataset_size: 175637126
- config_name: wstp.ml
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 88360504
num_examples: 27527
- name: validation
num_bytes: 11193340
num_examples: 3441
- name: test
num_bytes: 11150914
num_examples: 3441
download_size: 42336357
dataset_size: 110704758
- config_name: wstp.mr
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 28302341
num_examples: 10446
- name: validation
num_bytes: 3328798
num_examples: 1306
- name: test
num_bytes: 3631684
num_examples: 1306
download_size: 13886208
dataset_size: 35262823
- config_name: wstp.or
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 10900006
num_examples: 4015
- name: validation
num_bytes: 1264935
num_examples: 502
- name: test
num_bytes: 1344652
num_examples: 502
download_size: 5319128
dataset_size: 13509593
- config_name: wstp.pa
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 22189730
num_examples: 8772
- name: validation
num_bytes: 2789186
num_examples: 1097
- name: test
num_bytes: 2685767
num_examples: 1097
download_size: 11201369
dataset_size: 27664683
- config_name: wstp.ta
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 151929218
num_examples: 48940
- name: validation
num_bytes: 18817167
num_examples: 6117
- name: test
num_bytes: 18815071
num_examples: 6118
download_size: 68699092
dataset_size: 189561456
- config_name: wstp.te
features:
- name: sectionText
dtype: string
- name: correctTitle
dtype: string
- name: titleA
dtype: string
- name: titleB
dtype: string
- name: titleC
dtype: string
- name: titleD
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 151696691
num_examples: 80000
- name: validation
num_bytes: 19003169
num_examples: 10000
- name: test
num_bytes: 18991913
num_examples: 10000
download_size: 50158580
dataset_size: 189691773
configs:
- config_name: actsa-sc.te
data_files:
- split: train
path: actsa-sc.te/train-*
- split: validation
path: actsa-sc.te/validation-*
- split: test
path: actsa-sc.te/test-*
- config_name: bbca.hi
data_files:
- split: train
path: bbca.hi/train-*
- split: test
path: bbca.hi/test-*
- config_name: copa.en
data_files:
- split: train
path: copa.en/train-*
- split: validation
path: copa.en/validation-*
- split: test
path: copa.en/test-*
- config_name: copa.gu
data_files:
- split: train
path: copa.gu/train-*
- split: validation
path: copa.gu/validation-*
- split: test
path: copa.gu/test-*
- config_name: copa.hi
data_files:
- split: train
path: copa.hi/train-*
- split: validation
path: copa.hi/validation-*
- split: test
path: copa.hi/test-*
- config_name: copa.mr
data_files:
- split: train
path: copa.mr/train-*
- split: validation
path: copa.mr/validation-*
- split: test
path: copa.mr/test-*
- config_name: csqa.as
data_files:
- split: test
path: csqa.as/test-*
- config_name: csqa.bn
data_files:
- split: test
path: csqa.bn/test-*
- config_name: csqa.gu
data_files:
- split: test
path: csqa.gu/test-*
- config_name: csqa.hi
data_files:
- split: test
path: csqa.hi/test-*
- config_name: csqa.kn
data_files:
- split: test
path: csqa.kn/test-*
- config_name: csqa.ml
data_files:
- split: test
path: csqa.ml/test-*
- config_name: csqa.mr
data_files:
- split: test
path: csqa.mr/test-*
- config_name: csqa.or
data_files:
- split: test
path: csqa.or/test-*
- config_name: csqa.pa
data_files:
- split: test
path: csqa.pa/test-*
- config_name: csqa.ta
data_files:
- split: test
path: csqa.ta/test-*
- config_name: csqa.te
data_files:
- split: test
path: csqa.te/test-*
- config_name: cvit-mkb-clsr.en-bn
data_files:
- split: test
path: cvit-mkb-clsr.en-bn/test-*
- config_name: cvit-mkb-clsr.en-gu
data_files:
- split: test
path: cvit-mkb-clsr.en-gu/test-*
- config_name: cvit-mkb-clsr.en-hi
data_files:
- split: test
path: cvit-mkb-clsr.en-hi/test-*
- config_name: cvit-mkb-clsr.en-ml
data_files:
- split: test
path: cvit-mkb-clsr.en-ml/test-*
- config_name: cvit-mkb-clsr.en-mr
data_files:
- split: test
path: cvit-mkb-clsr.en-mr/test-*
- config_name: cvit-mkb-clsr.en-or
data_files:
- split: test
path: cvit-mkb-clsr.en-or/test-*
- config_name: cvit-mkb-clsr.en-ta
data_files:
- split: test
path: cvit-mkb-clsr.en-ta/test-*
- config_name: cvit-mkb-clsr.en-te
data_files:
- split: test
path: cvit-mkb-clsr.en-te/test-*
- config_name: cvit-mkb-clsr.en-ur
data_files:
- split: test
path: cvit-mkb-clsr.en-ur/test-*
- config_name: iitp-mr.hi
data_files:
- split: train
path: iitp-mr.hi/train-*
- split: validation
path: iitp-mr.hi/validation-*
- split: test
path: iitp-mr.hi/test-*
- config_name: iitp-pr.hi
data_files:
- split: train
path: iitp-pr.hi/train-*
- split: validation
path: iitp-pr.hi/validation-*
- split: test
path: iitp-pr.hi/test-*
- config_name: inltkh.gu
data_files:
- split: train
path: inltkh.gu/train-*
- split: validation
path: inltkh.gu/validation-*
- split: test
path: inltkh.gu/test-*
- config_name: inltkh.ml
data_files:
- split: train
path: inltkh.ml/train-*
- split: validation
path: inltkh.ml/validation-*
- split: test
path: inltkh.ml/test-*
- config_name: inltkh.mr
data_files:
- split: train
path: inltkh.mr/train-*
- split: validation
path: inltkh.mr/validation-*
- split: test
path: inltkh.mr/test-*
- config_name: inltkh.ta
data_files:
- split: train
path: inltkh.ta/train-*
- split: validation
path: inltkh.ta/validation-*
- split: test
path: inltkh.ta/test-*
- config_name: inltkh.te
data_files:
- split: train
path: inltkh.te/train-*
- split: validation
path: inltkh.te/validation-*
- split: test
path: inltkh.te/test-*
- config_name: md.hi
data_files:
- split: train
path: md.hi/train-*
- split: validation
path: md.hi/validation-*
- split: test
path: md.hi/test-*
- config_name: sna.bn
data_files:
- split: train
path: sna.bn/train-*
- split: validation
path: sna.bn/validation-*
- split: test
path: sna.bn/test-*
- config_name: wiki-ner.as
data_files:
- split: train
path: wiki-ner.as/train-*
- split: validation
path: wiki-ner.as/validation-*
- split: test
path: wiki-ner.as/test-*
- config_name: wiki-ner.bn
data_files:
- split: train
path: wiki-ner.bn/train-*
- split: validation
path: wiki-ner.bn/validation-*
- split: test
path: wiki-ner.bn/test-*
- config_name: wiki-ner.gu
data_files:
- split: train
path: wiki-ner.gu/train-*
- split: validation
path: wiki-ner.gu/validation-*
- split: test
path: wiki-ner.gu/test-*
- config_name: wiki-ner.hi
data_files:
- split: train
path: wiki-ner.hi/train-*
- split: validation
path: wiki-ner.hi/validation-*
- split: test
path: wiki-ner.hi/test-*
- config_name: wiki-ner.kn
data_files:
- split: train
path: wiki-ner.kn/train-*
- split: validation
path: wiki-ner.kn/validation-*
- split: test
path: wiki-ner.kn/test-*
- config_name: wiki-ner.ml
data_files:
- split: train
path: wiki-ner.ml/train-*
- split: validation
path: wiki-ner.ml/validation-*
- split: test
path: wiki-ner.ml/test-*
- config_name: wiki-ner.mr
data_files:
- split: train
path: wiki-ner.mr/train-*
- split: validation
path: wiki-ner.mr/validation-*
- split: test
path: wiki-ner.mr/test-*
- config_name: wiki-ner.or
data_files:
- split: train
path: wiki-ner.or/train-*
- split: validation
path: wiki-ner.or/validation-*
- split: test
path: wiki-ner.or/test-*
- config_name: wiki-ner.pa
data_files:
- split: train
path: wiki-ner.pa/train-*
- split: validation
path: wiki-ner.pa/validation-*
- split: test
path: wiki-ner.pa/test-*
- config_name: wiki-ner.ta
data_files:
- split: train
path: wiki-ner.ta/train-*
- split: validation
path: wiki-ner.ta/validation-*
- split: test
path: wiki-ner.ta/test-*
- config_name: wiki-ner.te
data_files:
- split: train
path: wiki-ner.te/train-*
- split: validation
path: wiki-ner.te/validation-*
- split: test
path: wiki-ner.te/test-*
- config_name: wnli.en
data_files:
- split: train
path: wnli.en/train-*
- split: validation
path: wnli.en/validation-*
- split: test
path: wnli.en/test-*
- config_name: wnli.gu
data_files:
- split: train
path: wnli.gu/train-*
- split: validation
path: wnli.gu/validation-*
- split: test
path: wnli.gu/test-*
- config_name: wnli.hi
data_files:
- split: train
path: wnli.hi/train-*
- split: validation
path: wnli.hi/validation-*
- split: test
path: wnli.hi/test-*
- config_name: wnli.mr
data_files:
- split: train
path: wnli.mr/train-*
- split: validation
path: wnli.mr/validation-*
- split: test
path: wnli.mr/test-*
- config_name: wstp.as
data_files:
- split: train
path: wstp.as/train-*
- split: validation
path: wstp.as/validation-*
- split: test
path: wstp.as/test-*
- config_name: wstp.bn
data_files:
- split: train
path: wstp.bn/train-*
- split: validation
path: wstp.bn/validation-*
- split: test
path: wstp.bn/test-*
- config_name: wstp.gu
data_files:
- split: train
path: wstp.gu/train-*
- split: validation
path: wstp.gu/validation-*
- split: test
path: wstp.gu/test-*
- config_name: wstp.hi
data_files:
- split: train
path: wstp.hi/train-*
- split: validation
path: wstp.hi/validation-*
- split: test
path: wstp.hi/test-*
- config_name: wstp.kn
data_files:
- split: train
path: wstp.kn/train-*
- split: validation
path: wstp.kn/validation-*
- split: test
path: wstp.kn/test-*
- config_name: wstp.ml
data_files:
- split: train
path: wstp.ml/train-*
- split: validation
path: wstp.ml/validation-*
- split: test
path: wstp.ml/test-*
- config_name: wstp.mr
data_files:
- split: train
path: wstp.mr/train-*
- split: validation
path: wstp.mr/validation-*
- split: test
path: wstp.mr/test-*
- config_name: wstp.or
data_files:
- split: train
path: wstp.or/train-*
- split: validation
path: wstp.or/validation-*
- split: test
path: wstp.or/test-*
- config_name: wstp.pa
data_files:
- split: train
path: wstp.pa/train-*
- split: validation
path: wstp.pa/validation-*
- split: test
path: wstp.pa/test-*
- config_name: wstp.ta
data_files:
- split: train
path: wstp.ta/train-*
- split: validation
path: wstp.ta/validation-*
- split: test
path: wstp.ta/test-*
- config_name: wstp.te
data_files:
- split: train
path: wstp.te/train-*
- split: validation
path: wstp.te/validation-*
- split: test
path: wstp.te/test-*
---
# Dataset Card for "indic_glue"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://ai4bharat.iitm.ac.in/indic-glue
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [IndicNLPSuite: Monolingual Corpora, Evaluation Benchmarks and Pre-trained Multilingual Language Models for Indian Languages](https://aclanthology.org/2020.findings-emnlp.445/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3.51 GB
- **Size of the generated dataset:** 1.65 GB
- **Total amount of disk used:** 5.16 GB
### Dataset Summary
IndicGLUE is a natural language understanding benchmark for Indian languages. It contains a wide
variety of tasks and covers 11 major Indian languages - as, bn, gu, hi, kn, ml, mr, or, pa, ta, te.
The Winograd Schema Challenge (Levesque et al., 2011) is a reading comprehension task
in which a system must read a sentence with a pronoun and select the referent of that pronoun from
a list of choices. The examples are manually constructed to foil simple statistical methods: Each
one is contingent on contextual information provided by a single word or phrase in the sentence.
To convert the problem into sentence pair classification, we construct sentence pairs by replacing
the ambiguous pronoun with each possible referent. The task is to predict if the sentence with the
pronoun substituted is entailed by the original sentence. We use a small evaluation set consisting of
new examples derived from fiction books that was shared privately by the authors of the original
corpus. While the included training set is balanced between two classes, the test set is imbalanced
between them (65% not entailment). Also, due to a data quirk, the development set is adversarial:
hypotheses are sometimes shared between training and development examples, so if a model memorizes the
training examples, it will predict the wrong label on the corresponding development set
examples. As with QNLI, each example is evaluated separately, so there is not a systematic correspondence
between a model's score on this task and its score on the unconverted original task. We
call the converted dataset WNLI (Winograd NLI). This dataset was translated and publicly released for 3
Indian languages by AI4Bharat.
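The pronoun-substitution conversion described above can be sketched as plain string replacement. The helper below is hypothetical (not part of the dataset tooling) and only illustrates how the sentence pairs are formed:

```python
def winograd_to_pairs(sentence, pronoun_span, referents):
    """Build (original, substituted) sentence pairs, one per candidate referent.

    The classification task is then to predict whether each substituted
    sentence is entailed by the original one.
    """
    return [(sentence, sentence.replace(pronoun_span, referent, 1))
            for referent in referents]

pairs = winograd_to_pairs(
    "The trophy doesn't fit in the suitcase because it is too big.",
    "it is too big",
    ["the trophy is too big", "the suitcase is too big"],
)
```

Only one of the resulting hypotheses is entailed by the premise, which is what makes the task hard for purely statistical methods.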
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### actsa-sc.te
- **Size of downloaded dataset files:** 0.38 MB
- **Size of the generated dataset:** 1.71 MB
- **Total amount of disk used:** 2.09 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"label": 0,
"text": "\"ప్రయాణాల్లో ఉన్నవారికోసం బస్ స్టేషన్లు, రైల్వే స్టేషన్లలో పల్స్పోలియో బూతులను ఏర్పాటు చేసి చిన్నారులకు పోలియో చుక్కలు వేసేలా ఏర..."
}
```
#### bbca.hi
- **Size of downloaded dataset files:** 5.77 MB
- **Size of the generated dataset:** 27.63 MB
- **Total amount of disk used:** 33.40 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"label": "pakistan",
"text": "\"नेटिजन यानि इंटरनेट पर सक्रिय नागरिक अब ट्विटर पर सरकार द्वारा लगाए प्रतिबंधों के समर्थन या विरोध में अपने विचार व्यक्त करते है..."
}
```
#### copa.en
- **Size of downloaded dataset files:** 0.75 MB
- **Size of the generated dataset:** 0.12 MB
- **Total amount of disk used:** 0.87 MB
An example of 'validation' looks as follows.
```
{
"choice1": "I swept the floor in the unoccupied room.",
"choice2": "I shut off the light in the unoccupied room.",
"label": 1,
"premise": "I wanted to conserve energy.",
"question": "effect"
}
```
#### copa.gu
- **Size of downloaded dataset files:** 0.75 MB
- **Size of the generated dataset:** 0.23 MB
- **Total amount of disk used:** 0.99 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"choice1": "\"સ્ત્રી જાણતી હતી કે તેનો મિત્ર મુશ્કેલ સમયમાંથી પસાર થઈ રહ્યો છે.\"...",
"choice2": "\"મહિલાને લાગ્યું કે તેના મિત્રએ તેની દયાળુ લાભ લીધો છે.\"...",
"label": 0,
"premise": "મહિલાએ તેના મિત્રની મુશ્કેલ વર્તન સહન કરી.",
"question": "cause"
}
```
#### copa.hi
- **Size of downloaded dataset files:** 0.75 MB
- **Size of the generated dataset:** 0.23 MB
- **Total amount of disk used:** 0.99 MB
An example of 'validation' looks as follows.
```
{
"choice1": "मैंने उसका प्रस्ताव ठुकरा दिया।",
"choice2": "उन्होंने मुझे उत्पाद खरीदने के लिए राजी किया।",
"label": 0,
"premise": "मैंने सेल्समैन की पिच पर शक किया।",
"question": "effect"
}
```
### Data Fields
The data fields are the same among all splits.
#### actsa-sc.te
- `text`: a `string` feature.
- `label`: a classification label, with possible values including `positive` (0), `negative` (1).
#### bbca.hi
- `label`: a `string` feature.
- `text`: a `string` feature.
#### copa.en
- `premise`: a `string` feature.
- `choice1`: a `string` feature.
- `choice2`: a `string` feature.
- `question`: a `string` feature.
- `label`: a `int32` feature.
#### copa.gu
- `premise`: a `string` feature.
- `choice1`: a `string` feature.
- `choice2`: a `string` feature.
- `question`: a `string` feature.
- `label`: a `int32` feature.
#### copa.hi
- `premise`: a `string` feature.
- `choice1`: a `string` feature.
- `choice2`: a `string` feature.
- `question`: a `string` feature.
- `label`: a `int32` feature.
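As a sketch of how these COPA fields fit together, the integer `label` indexes the correct alternative (0 for `choice1`, 1 for `choice2`), shown here with the `copa.en` validation instance from above:

```python
example = {
    "premise": "I wanted to conserve energy.",
    "choice1": "I swept the floor in the unoccupied room.",
    "choice2": "I shut off the light in the unoccupied room.",
    "question": "effect",  # whether the alternative is a cause or an effect of the premise
    "label": 1,
}

# label 0 -> choice1, label 1 -> choice2
correct_choice = example["choice1"] if example["label"] == 0 else example["choice2"]
```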
### Data Splits
#### actsa-sc.te
| |train|validation|test|
|-----------|----:|---------:|---:|
|actsa-sc.te| 4328| 541| 541|
#### bbca.hi
| |train|test|
|-------|----:|---:|
|bbca.hi| 3467| 866|
#### copa.en
| |train|validation|test|
|-------|----:|---------:|---:|
|copa.en| 400| 100| 500|
#### copa.gu
| |train|validation|test|
|-------|----:|---------:|---:|
|copa.gu| 362| 88| 448|
#### copa.hi
| |train|validation|test|
|-------|----:|---------:|---:|
|copa.hi| 362| 88| 449|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{kakwani-etal-2020-indicnlpsuite,
title = "{I}ndic{NLPS}uite: Monolingual Corpora, Evaluation Benchmarks and Pre-trained Multilingual Language Models for {I}ndian Languages",
author = "Kakwani, Divyanshu and
Kunchukuttan, Anoop and
Golla, Satish and
N.C., Gokul and
Bhattacharyya, Avik and
Khapra, Mitesh M. and
Kumar, Pratyush",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.findings-emnlp.445",
doi = "10.18653/v1/2020.findings-emnlp.445",
pages = "4948--4961",
}
@inproceedings{Levesque2011TheWS,
title={The Winograd Schema Challenge},
author={H. Levesque and E. Davis and L. Morgenstern},
booktitle={KR},
year={2011}
}
```
### Contributions
Thanks to [@sumanthd17](https://github.com/sumanthd17) for adding this dataset. |
bharath32/Medicare_testing | ---
license: other
---
|
NiranjanAndhe/dataset | ---
license: other
---
|
kopyl/mapped-pokemon-blip-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: prompt_embeds
sequence:
sequence: float32
- name: pooled_prompt_embeds
sequence: float32
- name: model_input
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 869477161.0
num_examples: 833
download_size: 851613359
dataset_size: 869477161.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mekaneeky/Processed-Luganda-SpeechT5-with-SALT-translation-11-7-23 | ---
dataset_info:
features:
- name: audio
sequence:
sequence: float32
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: encoder_input_values
sequence:
sequence: float32
- name: encoder_attention_mask
sequence:
sequence: int32
- name: acholi_transcription
dtype: string
- name: lugbara_transcription
dtype: string
- name: english_transcription
dtype: string
- name: runyankole_transcription
dtype: string
- name: ateso_transcription
dtype: string
splits:
- name: train
num_bytes: 43512528901
num_examples: 32352
- name: validation
num_bytes: 547401321
num_examples: 407
download_size: 9842097693
dataset_size: 44059930222
---
# Dataset Card for "Processed-Luganda-SpeechT5-with-SALT-translation-11-7-23"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ro635_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ro635/RO635/RO635 (Girls' Frontline)
This is the dataset of ro635/RO635/RO635 (Girls' Frontline), containing 494 images and their tags.
The core tags of this character are `long_hair, black_hair, multicolored_hair, streaked_hair, yellow_eyes, white_hair, heterochromia, bangs, red_eyes, breasts, twintails, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 494 | 669.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ro635_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 494 | 355.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ro635_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1151 | 743.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ro635_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 494 | 585.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ro635_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1151 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ro635_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ro635_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
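For the plain `IMG+TXT` packages, each image ships with a same-named `.txt` sidecar of comma-separated tags. A minimal stdlib-only sketch for collecting them (assuming that layout; not part of waifuc) could look like:

```python
import os

def read_tag_files(dataset_dir):
    """Map each file stem to its tag list, read from the sidecar .txt files."""
    tags = {}
    for name in os.listdir(dataset_dir):
        stem, ext = os.path.splitext(name)
        if ext == ".txt":
            path = os.path.join(dataset_dir, name)
            with open(path, encoding="utf-8") as f:
                tags[stem] = [t.strip() for t in f.read().split(",") if t.strip()]
    return tags
```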
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_gloves, simple_background, solo, yellow_jacket, looking_at_viewer, white_background, fingerless_gloves, white_shirt, black_skirt, holding, megaphone, orange_eyes, open_mouth, pleated_skirt, blush, hooded_jacket, open_jacket |
| 1 | 5 |  |  |  |  |  | 1girl, black_gloves, black_skirt, feet_out_of_frame, holding_gun, open_jacket, solo, yellow_jacket, brown_sweater_vest, closed_mouth, fingerless_gloves, looking_at_viewer, medium_breasts, orange_eyes, standing, white_shirt, blush, headphones, rifle, hairclip, id_card, simple_background, white_background |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, brown_sweater_vest, id_card, mask_around_neck, mod3_(girls'_frontline), respirator, solo, submachine_gun, white_background, yellow_jacket, holding_gun, open_jacket, simple_background, bare_shoulders, black_skirt, blush, looking_at_viewer, standing, ammunition_belt, closed_mouth, feet_out_of_frame, jacket_pull, megaphone, open_mouth |
| 3 | 8 |  |  |  |  |  | 1girl, black_gloves, holding_gun, mask_around_neck, mod3_(girls'_frontline), respirator, solo, submachine_gun, yellow_jacket, id_card, looking_at_viewer, brown_sweater_vest, lanyard, standing, black_skirt, closed_mouth, knee_pads, bag, black_thighhighs, feet_out_of_frame, open_clothes, pouch |
| 4 | 5 |  |  |  |  |  | 1girl, brown_sweater_vest, jacket_pull, mod3_(girls'_frontline), open_jacket, open_mouth, yellow_jacket, blush, hair_ornament, solo, upper_body, black_gloves, looking_at_viewer, medium_breasts, bare_shoulders, id_card, lanyard |
| 5 | 15 |  |  |  |  |  | 1girl, blush, solo, armpits, looking_at_viewer, mod3_(girls'_frontline), sleeveless_sweater, arms_up, lanyard, arms_behind_head, id_card, upper_body, brown_sweater, closed_mouth, simple_background, on_back |
| 6 | 9 |  |  |  |  |  | 1girl, solo, black_one-piece_swimsuit, competition_swimsuit, looking_at_viewer, simple_background, white_background, collarbone, covered_navel, standing, cowboy_shot, blush, closed_mouth, highleg_swimsuit |
| 7 | 6 |  |  |  |  |  | 2girls, blush, lanyard, simple_background, skirt, solo_focus, white_background, black_gloves, chibi, sweatdrop, yellow_jacket, open_mouth |
| 8 | 19 |  |  |  |  |  | 1girl, solo, black_gloves, looking_at_viewer, cleavage, simple_background, white_background, official_alternate_costume, blush, black_pantyhose, holding, black_dress, drinking_glass, bare_shoulders, butterfly_hair_ornament, closed_mouth |
| 9 | 10 |  |  |  |  |  | 1girl, cleavage, navel, solo, blush, collarbone, black_panties, looking_at_viewer, black_bra, underwear_only, bare_shoulders, stomach, white_background, closed_mouth, lingerie, medium_breasts, simple_background |
| 10 | 7 |  |  |  |  |  | 1girl, cleavage, collarbone, solo, bare_shoulders, black_dress, looking_at_viewer, necklace, simple_background, white_background, blush, low_twintails, medium_breasts, open_mouth, smile, standing, alternate_costume, holding_instrument |
| 11 | 5 |  |  |  |  |  | 1girl, black_leotard, black_pantyhose, blush, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, solo, white_background, cleavage, looking_at_viewer, simple_background, wrist_cuffs, black_bowtie, holding, alternate_costume, artist_name, closed_mouth, covered_navel, hand_on_hip, low_twintails, megaphone, standing, wedding_ring, yellow_jacket |
| 12 | 15 |  |  |  |  |  | 1girl, hetero, 1boy, blush, completely_nude, navel, penis, solo_focus, bar_censor, nipples, cum, pussy, sex, vaginal, open_mouth, sweat, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | simple_background | solo | yellow_jacket | looking_at_viewer | white_background | fingerless_gloves | white_shirt | black_skirt | holding | megaphone | orange_eyes | open_mouth | pleated_skirt | blush | hooded_jacket | open_jacket | feet_out_of_frame | holding_gun | brown_sweater_vest | closed_mouth | medium_breasts | standing | headphones | rifle | hairclip | id_card | mask_around_neck | mod3_(girls'_frontline) | respirator | submachine_gun | bare_shoulders | ammunition_belt | jacket_pull | lanyard | knee_pads | bag | black_thighhighs | open_clothes | pouch | hair_ornament | upper_body | armpits | sleeveless_sweater | arms_up | arms_behind_head | brown_sweater | on_back | black_one-piece_swimsuit | competition_swimsuit | collarbone | covered_navel | cowboy_shot | highleg_swimsuit | 2girls | skirt | solo_focus | chibi | sweatdrop | cleavage | official_alternate_costume | black_pantyhose | black_dress | drinking_glass | butterfly_hair_ornament | navel | black_panties | black_bra | underwear_only | stomach | lingerie | necklace | low_twintails | smile | alternate_costume | holding_instrument | black_leotard | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | wrist_cuffs | black_bowtie | artist_name | hand_on_hip | wedding_ring | hetero | 1boy | completely_nude | penis | bar_censor | nipples | cum | pussy | sex | vaginal | sweat |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:-------|:----------------|:--------------------|:-------------------|:--------------------|:--------------|:--------------|:----------|:------------|:--------------|:-------------|:----------------|:--------|:----------------|:--------------|:--------------------|:--------------|:---------------------|:---------------|:-----------------|:-----------|:-------------|:--------|:-----------|:----------|:-------------------|:--------------------------|:-------------|:-----------------|:-----------------|:------------------|:--------------|:----------|:------------|:------|:-------------------|:---------------|:--------|:----------------|:-------------|:----------|:---------------------|:----------|:-------------------|:----------------|:----------|:---------------------------|:-----------------------|:-------------|:----------------|:--------------|:-------------------|:---------|:--------|:-------------|:--------|:------------|:-----------|:-----------------------------|:------------------|:--------------|:-----------------|:--------------------------|:--------|:----------------|:------------|:-----------------|:----------|:-----------|:-----------|:----------------|:--------|:--------------------|:---------------------|:----------------|:------------------|:-------------------|:----------------|:--------------|:--------------|:---------------|:--------------|:--------------|:---------------|:---------|:-------|:------------------|:--------|:-------------|:----------|:------|:--------|:------|:----------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | X | | X | | X | | X | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | X | X | | | | X | | | | | | | | | X | X | X | X | | X | | | | X | X | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | X | X | | | | | | | | X | | X | | X | | | X | | X | | | | | X | | X | | | X | | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | X | | X | X | | X | | | | | | | | | | X | | | | | | X | | | | | | X | | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | | X | X | | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 19 |  |  |  |  |  | X | X | X | X | | X | X | | | | X | | | | | X | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 10 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | | | X | | | | | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | X | | X | | | | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | X | X | X | X | X | | | | X | X | | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | X | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 12 | 15 |  |  |  |  |  | X | | | | | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-staging-eval-project-adversarial_qa-e34332b7-12205627 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: deepset/xlm-roberta-base-squad2-distilled
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/xlm-roberta-base-squad2-distilled
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ceyda](https://huggingface.co/ceyda) for evaluating this model. |
Sakshamrzt/IndicNLP-Gujarati | ---
license: cc-by-nc-4.0
dataset_info:
- config_name: train
features:
- name: news
dtype: string
- name: class
dtype: float64
splits:
- name: train
num_examples: 1018
configs:
- config_name: train
data_files:
- split: train
path: gujaratitrain.jsonl
- config_name: test
data_files:
- split: test
path: gujaratitest.jsonl
task_categories:
- text-classification
language:
- gu
--- |
oakink/OakInk-v1 | ---
license: cc-by-nc-sa-3.0
task_categories:
- image-to-3d
language:
- en
size_categories:
- 100K<n<1M
---
# Dataset Card for OakInk-v1
- **Project:** https://oakink.net
- **Code:** https://github.com/oakink/OakInk
- **Paper:** https://arxiv.org/abs/2203.15709
|
gadams/ruby | ---
license: other
---
|
TesterSet/fundacao | ---
license: openrail
---
|
elyadenysova/sileod_mindgames_inference | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4510783
num_examples: 11174
- name: validation
num_bytes: 1504634
num_examples: 3725
- name: test
num_bytes: 1512203
num_examples: 3725
download_size: 1331638
dataset_size: 7527620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
davozing223/jonasmaneiro | ---
license: openrail
---
|
Seanxh/twitter_dataset_1713074226 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 167700
num_examples: 413
download_size: 56357
dataset_size: 167700
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
freshpearYoon/v3_train_free_concat_23 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842471584
num_examples: 2500
download_size: 1761575747
dataset_size: 3842471584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
malaysia-ai/Wikipedia-Malaysian-Politicians-multiturn | ---
language:
- ms
---
# Wikipedia-Malaysian-Politicians multiturn
Original dataset at https://huggingface.co/datasets/Englios/Wikipedia-Malaysian-Politicians; we translated it and prepared a multi-turn chat template. |
marup/WeiChenRVC | ---
license: openrail
---
|
CuiMuxuan/bert-vits2 | ---
license: openrail
---
|
open-llm-leaderboard/details_budecosystem__genz-13b-v2 | ---
pretty_name: Evaluation run of budecosystem/genz-13b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [budecosystem/genz-13b-v2](https://huggingface.co/budecosystem/genz-13b-v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__genz-13b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-09-22T15:10:42.007664](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-13b-v2/blob/main/results_2023-09-22T15-10-42.007664.json)\
  \ (note that there might be results for other tasks in the repo if successive\
  \ evals didn't cover the same tasks. You can find each in the results and the\
  \ \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1649538590604027,\n\
\ \"em_stderr\": 0.0038008097202810163,\n \"f1\": 0.2284354026845635,\n\
\ \"f1_stderr\": 0.003875004173850451,\n \"acc\": 0.434338336007104,\n\
\ \"acc_stderr\": 0.010638707911291463\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1649538590604027,\n \"em_stderr\": 0.0038008097202810163,\n\
\ \"f1\": 0.2284354026845635,\n \"f1_stderr\": 0.003875004173850451\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \
\ \"acc_stderr\": 0.009041108602874659\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708266\n\
\ }\n}\n```"
repo_url: https://huggingface.co/budecosystem/genz-13b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|arc:challenge|25_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_10_42.007664
path:
- '**/details_harness|drop|3_2023-09-22T15-10-42.007664.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-10-42.007664.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_10_42.007664
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-10-42.007664.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-10-42.007664.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hellaswag|10_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_10_42.007664
path:
- '**/details_harness|winogrande|5_2023-09-22T15-10-42.007664.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-10-42.007664.parquet'
- config_name: results
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- results_2023-09-01T20:10:58.208495.parquet
- split: 2023_09_22T15_10_42.007664
path:
- results_2023-09-22T15-10-42.007664.parquet
- split: latest
path:
- results_2023-09-22T15-10-42.007664.parquet
---
# Dataset Card for Evaluation run of budecosystem/genz-13b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/genz-13b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [budecosystem/genz-13b-v2](https://huggingface.co/budecosystem/genz-13b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__genz-13b-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T15:10:42.007664](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-13b-v2/blob/main/results_2023-09-22T15-10-42.007664.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1649538590604027,
"em_stderr": 0.0038008097202810163,
"f1": 0.2284354026845635,
"f1_stderr": 0.003875004173850451,
"acc": 0.434338336007104,
"acc_stderr": 0.010638707911291463
},
"harness|drop|3": {
"em": 0.1649538590604027,
"em_stderr": 0.0038008097202810163,
"f1": 0.2284354026845635,
"f1_stderr": 0.003875004173850451
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874659
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708266
}
}
```
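As a quick sanity check, the top-level `acc` in the `"all"` block matches the unweighted mean of the per-task accuracies reported above; a minimal sketch with the values copied from the JSON:

```python
# Per-task accuracies copied from the latest results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.12282031842304776,
    "harness|winogrande|5": 0.7458563535911602,
}

# For this run, "all"["acc"] is the unweighted mean over the acc-style tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.434338336007104) < 1e-12  # matches "all"["acc"]
```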
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/find_last_sent_train_10_eval_10_hint3 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 39585
num_examples: 30
- name: validation
num_bytes: 9250
num_examples: 10
download_size: 45630
dataset_size: 48835
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_10_eval_10_hint3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sultu/Kelvin | ---
license: openrail
---
|
mar-yam1497/HotPotQA_Mistral_Instruct_dataset_Top3k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 15440892
num_examples: 3000
download_size: 6912344
dataset_size: 15440892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Anoop03031988/news_summary | ---
license: apache-2.0
language:
- en
pretty_name: news_summarizer
--- |
open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B | ---
pretty_name: Evaluation run of ajibawa-2023/Uncensored-Frank-33B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ajibawa-2023/Uncensored-Frank-33B](https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-30T04:12:18.796375](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B/blob/main/results_2023-10-30T04-12-18.796375.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17554530201342283,\n\
\ \"em_stderr\": 0.0038959884031644423,\n \"f1\": 0.2628104026845651,\n\
\ \"f1_stderr\": 0.003991015722513057,\n \"acc\": 0.4661905140880088,\n\
\ \"acc_stderr\": 0.01108732307443375\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.17554530201342283,\n \"em_stderr\": 0.0038959884031644423,\n\
\ \"f1\": 0.2628104026845651,\n \"f1_stderr\": 0.003991015722513057\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16679302501895377,\n \
\ \"acc_stderr\": 0.010268516042629513\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_30T04_12_18.796375
path:
- '**/details_harness|drop|3_2023-10-30T04-12-18.796375.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-30T04-12-18.796375.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_30T04_12_18.796375
path:
- '**/details_harness|gsm8k|5_2023-10-30T04-12-18.796375.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-30T04-12-18.796375.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-30-05.303429.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-30-05.303429.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_30T04_12_18.796375
path:
- '**/details_harness|winogrande|5_2023-10-30T04-12-18.796375.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-30T04-12-18.796375.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_30_05.303429
path:
- results_2023-10-03T17-30-05.303429.parquet
- split: 2023_10_30T04_12_18.796375
path:
- results_2023-10-30T04-12-18.796375.parquet
- split: latest
path:
- results_2023-10-30T04-12-18.796375.parquet
---
# Dataset Card for Evaluation run of ajibawa-2023/Uncensored-Frank-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ajibawa-2023/Uncensored-Frank-33B](https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-30T04:12:18.796375](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B/blob/main/results_2023-10-30T04-12-18.796375.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.17554530201342283,
"em_stderr": 0.0038959884031644423,
"f1": 0.2628104026845651,
"f1_stderr": 0.003991015722513057,
"acc": 0.4661905140880088,
"acc_stderr": 0.01108732307443375
},
"harness|drop|3": {
"em": 0.17554530201342283,
"em_stderr": 0.0038959884031644423,
"f1": 0.2628104026845651,
"f1_stderr": 0.003991015722513057
},
"harness|gsm8k|5": {
"acc": 0.16679302501895377,
"acc_stderr": 0.010268516042629513
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nickolass/Fiddle | ---
license: openrail
---
|
gabeorlanski/bc-humaneval | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- en
tags:
- code
pretty_name: BabelCode HumanEval
size_categories:
- 1K<n<10K
source_datasets:
- original
- extended|openai_humaneval
---
# Dataset Card for BabelCode HumanEval
## Dataset Description
- **Repository:** [GitHub Repository](https://github.com/google-research/babelcode)
- **Paper:** [Measuring The Impact Of Programming Language Distribution](https://arxiv.org/abs/2302.01973)
### How To Use This Dataset
To use this dataset, you can either use the original [BabelCode Repo](https://github.com/google-research/babelcode), or you can use the [`bc_eval` Metric](https://huggingface.co/spaces/gabeorlanski/bc_eval).
### Dataset Summary
The BabelCode-HumanEval (BC-HumanEval) dataset converts the [HumanEval dataset released by OpenAI](https://github.com/openai/human-eval) to 16 programming languages.
### Supported Tasks and Leaderboards
### Languages
BC-HumanEval supports:
* C++
* C#
* Dart
* Go
* Haskell
* Java
* Javascript
* Julia
* Kotlin
* Lua
* PHP
* Python
* R
* Rust
* Scala
* TypeScript
## Dataset Structure
```python
>>> from datasets import load_dataset
>>> load_dataset("gabeorlanski/bc-humaneval")
DatasetDict({
test: Dataset({
features: ['qid', 'title', 'language', 'text', 'signature_with_docstring', 'signature', 'arguments', 'solution', 'question_info'],
num_rows: 2576
})
})
```
### Data Fields
- `qid`: The question ID used for running tests.
- `title`: The title of the question.
- `language`: The programming language of the example.
- `text`: The description of the problem.
- `signature`: The signature for the problem.
- `signature_with_docstring`: The signature with the adequately formatted docstring for the given problem.
- `arguments`: The arguments of the problem.
- `solution`: The solution in Python.
- `question_info`: The dict of information used for executing predictions. It has the keys:
- `test_code`: The raw testing script used in the language. If you want to use this, replace `PLACEHOLDER_FN_NAME` (and `PLACEHOLDER_CLS_NAME` if needed) with the corresponding entry points. Next, replace `PLACEHOLDER_CODE_BODY` with the postprocessed prediction.
- `test_list`: The raw json line of the list of tests for the problem. To load them, use `json.loads`
- `test_case_ids`: The list of test case ids for the problem. These are used to determine if a prediction passes or not.
- `entry_fn_name`: The function name to use as the entry point.
- `entry_cls_name`: The class name to use as the entry point.
- `commands`: The commands used to execute the prediction. Includes a `__FILENAME__` hole that is replaced with the filename.
- `timeouts`: The default timeouts for each command.
- `extension`: The extension for the prediction file.
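As a rough illustration (this is a sketch, not the official BabelCode or `bc_eval` implementation; `build_submission` and `load_tests` are hypothetical helper names), the placeholder substitution and test decoding described above might look like:

```python
import json

def build_submission(question_info: dict, prediction_code: str) -> str:
    """Sketch: fill the placeholder holes in `test_code` with a prediction."""
    source = question_info["test_code"]
    # Substitute the entry-point names first.
    source = source.replace("PLACEHOLDER_FN_NAME", question_info["entry_fn_name"])
    if question_info.get("entry_cls_name"):
        source = source.replace("PLACEHOLDER_CLS_NAME", question_info["entry_cls_name"])
    # Then insert the postprocessed model prediction.
    return source.replace("PLACEHOLDER_CODE_BODY", prediction_code)

def load_tests(question_info: dict) -> list:
    """Sketch: `test_list` is a raw JSON line, so decode it with json.loads."""
    return json.loads(question_info["test_list"])
```

For real evaluations, prefer the `bc_eval` metric or the BabelCode repository linked above, which handle language-specific postprocessing and execution.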
**NOTE:** If you want to use a different function name (or class name for languages that require class names) for the prediction, you must update the `entry_fn_name` and `entry_cls_name` accordingly. For example, if you have the original question with `entry_fn_name` of `add`, but want to change it to `f`, you must update `ds["question_info"]["entry_fn_name"]` to `f`:
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("gabeorlanski/bc-humaneval")['test']
>>> # The original entry_fn_name
>>> ds[0]['question_info']['entry_fn_name']
'hasCloseElements'
>>> # You MUST update the corresponding entry_fn_name
>>> ds[0]['question_info']['entry_fn_name'] = 'f'
>>> ds[0]['question_info']['entry_fn_name']
'f'
```
## Dataset Creation
See section 2 of the [BabelCode Paper](https://arxiv.org/abs/2302.01973) to learn more about how the datasets are translated.
For information on how the original HumanEval was curated, please see the [Evaluating Large Language Models Trained on Code paper](https://arxiv.org/abs/2107.03374).
### Dataset Curators
Google Research
### Licensing Information
CC-BY-4.0
### Citation Information
```
@article{orlanski2023measuring,
title={Measuring The Impact Of Programming Language Distribution},
  author={Orlanski, Gabriel and Xiao, Kefan and Garcia, Xavier and Hui, Jeffrey and Howland, Joshua and Malmaud, Jonathan and Austin, Jacob and Singh, Rishabh and Catasta, Michele},
journal={arXiv preprint arXiv:2302.01973},
year={2023}
}
@article{chen2021codex,
title={Evaluating Large Language Models Trained on Code},
author={Mark Chen and Jerry Tworek and Heewoo Jun and Qiming Yuan and Henrique Ponde de Oliveira Pinto and Jared Kaplan and Harri Edwards and Yuri Burda and Nicholas Joseph and Greg Brockman and Alex Ray and Raul Puri and Gretchen Krueger and Michael Petrov and Heidy Khlaaf and Girish Sastry and Pamela Mishkin and Brooke Chan and Scott Gray and Nick Ryder and Mikhail Pavlov and Alethea Power and Lukasz Kaiser and Mohammad Bavarian and Clemens Winter and Philippe Tillet and Felipe Petroski Such and Dave Cummings and Matthias Plappert and Fotios Chantzis and Elizabeth Barnes and Ariel Herbert-Voss and William Hebgen Guss and Alex Nichol and Alex Paino and Nikolas Tezak and Jie Tang and Igor Babuschkin and Suchir Balaji and Shantanu Jain and William Saunders and Christopher Hesse and Andrew N. Carr and Jan Leike and Josh Achiam and Vedant Misra and Evan Morikawa and Alec Radford and Matthew Knight and Miles Brundage and Mira Murati and Katie Mayer and Peter Welinder and Bob McGrew and Dario Amodei and Sam McCandlish and Ilya Sutskever and Wojciech Zaremba},
year={2021},
eprint={2107.03374},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
HuggingFaceTB/web_under_line_mean_100 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: summary
dtype: string
- name: examples
dtype: string
- name: __index_level_0__
dtype: int64
- name: category
dtype: string
- name: educational_score
dtype: string
- name: generation_type
dtype: string
- name: line_mean
dtype: float64
- name: line_max
dtype: int64
splits:
- name: train
num_bytes: 5059512.492
num_examples: 1160
download_size: 1828809
dataset_size: 5059512.492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/moriyama_nanaki_fatekaleidlinerprismaillya | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Moriyama Nanaki
This is the dataset of Moriyama Nanaki, containing 132 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 132 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 251 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 132 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 132 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 132 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 132 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 132 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 251 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 251 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 251 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
andersonbcdefg/github_issues_markdown | ---
dataset_info:
features:
- name: text1
dtype: string
- name: text2
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 84836992
num_examples: 18565
- name: valid
num_bytes: 6778969
num_examples: 1547
- name: test
num_bytes: 5972868
num_examples: 1548
download_size: 39958866
dataset_size: 97588829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
MikhailT/cmu-arctic | ---
license: mit
language:
- en
pretty_name: CMU Arctic
dataset_info:
features:
- name: speaker
dtype: string
- name: file
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: aew
num_bytes: 124532319
num_examples: 1132
- name: ahw
num_bytes: 65802249
num_examples: 593
- name: aup
num_bytes: 55771949
num_examples: 593
- name: awb
num_bytes: 106781643
num_examples: 1138
- name: axb
num_bytes: 67641455
num_examples: 593
- name: bdl
num_bytes: 97845496
num_examples: 1131
- name: clb
num_bytes: 123294691
num_examples: 1132
- name: eey
num_bytes: 55460671
num_examples: 592
- name: fem
num_bytes: 57115651
num_examples: 593
- name: gka
num_bytes: 64208369
num_examples: 592
- name: jmk
num_bytes: 103401609
num_examples: 1114
- name: ksp
num_bytes: 114080099
num_examples: 1132
- name: ljm
num_bytes: 51847413
num_examples: 593
- name: lnh
num_bytes: 120446549
num_examples: 1132
- name: rms
num_bytes: 127163811
num_examples: 1132
- name: rxr
num_bytes: 83873386
num_examples: 666
- name: slp
num_bytes: 72360869
num_examples: 593
- name: slt
num_bytes: 108798337
num_examples: 1132
download_size: 1577150976
dataset_size: 1600426566
size_categories:
- 10K<n<100K
---
# CMU Arctic Dataset |
kgr123/quality_counter_2500_4_simple | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 13952319
num_examples: 1929
- name: train
num_bytes: 13814105
num_examples: 1935
- name: validation
num_bytes: 14102516
num_examples: 1941
download_size: 9400115
dataset_size: 41868940
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Besedo/artificial_weapon | ---
annotations_creators:
- machine-generated
language: []
language_creators:
- machine-generated
license: []
multilinguality: []
pretty_name: artificial_weapon
size_categories:
- 1K<n<10K
source_datasets: []
tags:
- weapon
- image
task_categories:
- image-classification
task_ids: []
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-54000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1055089
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Teklia/NorHand-v3-line | ---
license: mit
language:
- nb
task_categories:
- image-to-text
pretty_name: NorHand-v3-line
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_examples: 222381
- name: validation
num_examples: 22679
- name: test
num_examples: 1562
dataset_size: 246622
tags:
- atr
- htr
- ocr
- historical
- handwritten
---
# NorHand v3 - line level
## Table of Contents
- [NorHand v3 - line level](#norhand-v3-line-level)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
## Dataset Description
- **Homepage:** [Hugin-Munin project](https://hugin-munin-project.github.io/)
- **Source:** [Zenodo](https://zenodo.org/records/10255840)
- **Point of Contact:** [TEKLIA](https://teklia.com)
## Dataset Summary
The NorHand v3 dataset comprises Norwegian letter and diary line images and text from the 19th and early 20th centuries.
Note that all images are resized to a fixed height of 128 pixels.
### Languages
All the documents in the dataset are written in Norwegian Bokmål.
## Dataset Structure
### Data Instances
```
{
  'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=4300x128 at 0x1A800E8E190>,
'text': 'Til Bestyrelsen af'
}
```
### Data Fields
- `image`: a PIL.Image.Image object containing the image. Note that when accessing the image column (using dataset[0]["image"]), the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the "image" column, i.e. dataset[0]["image"] should always be preferred over dataset["image"][0].
- `text`: the label transcription of the image. |
openerotica/lyric-analysis | ---
license: apache-2.0
---
This dataset was an attempt to reverse engineer song lyrics into training data using GPT-turbo. The dataset was supposed to be much bigger, but I suffered a catastrophic crash during processing and was only able to recover a small portion. This is what I was able to salvage, and it still definitely needs some post-processing. You might be better off just starting over from scratch, but I didn't want to throw this away if somebody can salvage it for something. |
royyanai/ddpm-butterflies-128 | ---
license: unknown
---
|
heliosprime/twitter_dataset_1713185931 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9910
num_examples: 24
download_size: 12684
dataset_size: 9910
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713185931"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jschoormans/humanpose_densepose | ---
license: bsd
dataset_info:
features:
- name: file_name
dtype: image
- name: conditioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1152541296.128
num_examples: 24984
download_size: 1063210762
dataset_size: 1152541296.128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
art-bashkirev/NTINeuroSci | ---
license: unknown
---
|
hiranb/testmathqa | ---
license: apache-2.0
---
|
datahrvoje/twitter_dataset_1713141974 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17128
num_examples: 40
download_size: 10320
dataset_size: 17128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sayan1101/model_v1_instruction_finetuning_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 27415192.0
num_examples: 52002
download_size: 12320134
dataset_size: 27415192.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "model_v1_instruction_finetuning_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnnikaSimonsen/combined_train_dataset_en-fo | ---
dataset_info:
features:
- name: File name
dtype: string
- name: English
dtype: string
- name: Faroese translation
dtype: string
splits:
- name: train
num_bytes: 11318248
num_examples: 105634
download_size: 7455201
dataset_size: 11318248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
foilfoilfoil/FireCheese | ---
license: other
---
|
Worldwars/caka | ---
license: cc0-1.0
---
|
monmamo/carmos | ---
license: cc
---
|
Falah/portrait_best_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 20785006
num_examples: 100000
download_size: 516227
dataset_size: 20785006
---
# Dataset Card for "portrait_best_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb-pt/amazon_reviews | ---
configs:
- config_name: pt-br
data_files:
- split: test
path: amazon_reviews_test_pt*
- split: train
path: train*
language:
- pt
--- |
alexshengzhili/SciGraphQA-295K-train | ---
license: mit
dataset_info:
features:
- name: image_file
dtype: string
- name: id
dtype: string
- name: caption
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: first_mention
dtype: string
- name: response
dtype: string
- name: title
dtype: string
- name: abstract
dtype: string
- name: q_a_pairs
sequence:
sequence: string
splits:
- name: train
num_bytes: 1586351961.3841674
num_examples: 295602
download_size: 770588612
dataset_size: 1586351961.3841674
---
# Dataset Card for SciGraphQA
## Dataset Description
- **Homepage:** https://github.com/findalexli/SciGraphQA
- **Repository:** https://huggingface.co/datasets/alexshengzhili/SciGraphQA-295K-train
- **Paper:** https://arxiv.org/abs/2308.03349
- **Leaderboard:** N/A
- **Point of Contact:** Alex Li (alex.shengzhi@gmail.com)
### Dataset Summary
SciGraphQA is a large-scale synthetic multi-turn question-answering dataset for scientific graphs. It contains 295K samples of open-vocabulary multi-turn question-answering dialogues about graphs from 290K academic papers. The dataset was created by using the Palm-2 API to generate dialogues conditioned on rich textual context including paper titles, abstracts, captions, paragraphs mentioning the figure.
### Supported Tasks and Leaderboards
- Scientific graph question answering
- Visual question answering
- Multi-modal reasoning
Please see our paper for the leaderboard.
### Languages
English
## Dataset Structure
### Data Instances
Each data instance contains:
- Paper title
- Paper abstract
- Figure caption
- Paragraph mentioning the figure
- Multi-turn question-answer conversation (2.23 turns on average)
### Data Fields
- `title`: Paper title
- `abstract`: Paper abstract
- `caption`: Figure caption
- `paragraph`: Paragraph mentioning the figure
- `questions`: List of question strings
- `answers`: List of answer strings
### Data Splits
- Training data: 295K samples
- Validation data: N/A
- Test data: 3K samples
## Dataset Creation
### Curation Rationale
This dataset was created to provide a large-scale benchmark for training and evaluating multi-modal models on scientific graph question answering.
### Source Data
Figures, captions, paragraphs and metadata were sourced from 290K academic papers on ArXiv focused on Computer Science and Machine Learning.
#### Initial Data Collection and Normalization
Figures were extracted using PDFFigures 2.0. Captions and paragraphs were extracted using regular expressions and heuristic rules.
#### Who are the source language producers?
The source data consists of academic papers written in English by researchers in computer science and machine learning.
### Annotations
#### Annotation process
The multi-turn question-answer dialogues were generated using the Palm-2 conversational API conditioned on the sourced data context. The quality was validated by rating a subset with GPT-4.
#### Who are the annotators?
The dialogues were automatically generated by Palm-2, a large language model developed by Google.
### Personal and Sensitive Information
The source academic papers may contain limited personal information about the authors such as name, affiliation, email. No other personal or sensitive information is included in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset presents minimal social risks since it contains only synthetic dialogues about scientific graphs and related metadata sourced from public academic papers.
### Discussion of Biases
The dialogues reflect the characteristics and limitations of the Palm-2 system used to generate them. There may also be biases inherent in the academic source material.
### Other Known Limitations
The dataset focuses specifically on computer science and machine learning papers. Performance on scientific graphs from other domains may differ.
## Additional Information
### Dataset Curators
Shengzhi Li, Nima Tajbakhsh
### Licensing Information
This dataset is licensed under the MIT license.
### Citation Information
```
@misc{li2023scigraphqa,
      title={SciGraphQA: A Large-Scale Synthetic Multi-Turn Question-Answering Dataset for Scientific Graphs},
      author={Shengzhi Li and Nima Tajbakhsh},
      year={2023},
      eprint={2308.03349},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
### Contributions
We welcome contributions to improve the dataset! Please open an issue or pull request on the GitHub repository.
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of bremerton/ブレマートン/布莱默顿 (Azur Lane)
This is the dataset of bremerton/ブレマートン/布莱默顿 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `breasts, long_hair, pink_hair, bangs, multicolored_hair, streaked_hair, pink_eyes, large_breasts, twintails, hair_between_eyes, mole, grey_hair, hair_ornament, two-tone_hair, mole_under_eye, sidelocks, mole_on_breast`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
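Pruning the core (character-defining) tags can be sketched as a simple filter; `CORE_TAGS` below is a subset of the list above, and the filtering logic is an assumption for illustration, not the exact pipeline:

```python
# Subset of the core tags listed above, assumed to be pruned from every image
# so that only pose/outfit/scene tags remain in the per-image tag lists.
CORE_TAGS = {"breasts", "long_hair", "pink_hair", "bangs", "pink_eyes", "twintails"}

def prune_core_tags(tags: list[str]) -> list[str]:
    """Drop character-defining tags, keeping the order of the rest."""
    return [t for t in tags if t not in CORE_TAGS]

print(prune_core_tags(["1girl", "pink_hair", "smile", "twintails", "solo"]))
# -> ['1girl', 'smile', 'solo']
```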
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 989.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bremerton_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 470.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bremerton_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1385 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/bremerton_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 832.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bremerton_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1385 | 1.69 GiB | [Download](https://huggingface.co/datasets/CyberHarem/bremerton_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
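For the IMG+TXT packages, each image ships with a same-stem `.txt` tag file. A minimal sketch of pairing them after extraction (the flat directory layout is assumed from the package type, not verified against the archives):

```python
import os
import tempfile

def pair_img_txt(dataset_dir: str) -> list[tuple[str, str]]:
    """Return (image_path, tag_string) pairs for every image in dataset_dir
    that has a matching .txt file with the same stem."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs

# Demo with a throwaway directory standing in for an extracted package.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "0001.png"), "wb").close()
    with open(os.path.join(d, "0001.txt"), "w", encoding="utf-8") as f:
        f.write("1girl, solo, smile")
    pairs = pair_img_txt(d)
    print(len(pairs), pairs[0][1])  # -> 1 1girl, solo, smile
```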
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bremerton_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The tag clustering results are listed below; distinct outfits can often be identified from these clusters.
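The clustering method is not documented here; one simple way to compare outfit clusters, sketched under the assumption that clusters of the same outfit share most of their tag vocabulary, is Jaccard similarity between per-cluster tag sets:

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two tag sets (1.0 = identical, 0.0 = disjoint)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Toy tag sets loosely modeled on the tennis and china-dress clusters;
# the exact values here are illustrative only.
tennis_a = {"tennis_uniform", "sleeveless_shirt", "x_hair_ornament", "sweat"}
tennis_b = {"tennis_uniform", "sleeveless_shirt", "x_hair_ornament", "see-through"}
china = {"china_dress", "double_bun", "round_eyewear", "pelvic_curtain"}

print(jaccard(tennis_a, tennis_b))  # -> 0.6 (likely the same outfit)
print(jaccard(tennis_a, china))     # -> 0.0 (different outfits)
```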
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, bare_shoulders, black_sweater, official_alternate_costume, sweater_dress, long_sleeves, looking_at_viewer, bra_strap, cleavage, off-shoulder_sweater, open_jacket, solo, strap_between_breasts, white_jacket, black_choker, black_ribbon, collarbone, off-shoulder_dress, star_print, blush, hair_intakes, hair_ribbon, smile, eyewear_hang, sunglasses, simple_background, standing, white_background, black_hairband, cowboy_shot, disposable_cup, holding, orange-tinted_eyewear, bubble_tea, no_mole, open_mouth, sitting, tongue, upper_body |
| 1 | 11 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, thighs, blue_hair, cleavage, collarbone, grin, indoors, short_sleeves, white_shirt, window, huge_breasts, navel, black_panties, teeth |
| 2 | 6 |  |  |  |  |  | bare_shoulders, blush, cleavage, collarbone, looking_at_viewer, smile, 1girl, huge_breasts, navel, solo, thighs, wet, bikini, sky, closed_mouth, night, water |
| 3 | 12 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, navel_piercing, pink_bikini, solo, collarbone, looking_at_viewer, black_shorts, short_shorts, simple_background, smile, white_background, belt, black_choker, thighhighs, thighs, lifebuoy_ornament, front-tie_bikini_top, highleg_bikini, nail_polish |
| 4 | 18 |  |  |  |  |  | 1girl, bikini_under_clothes, black_shorts, midriff, pink_bikini, blush, highleg_bikini, short_shorts, sunglasses, cleavage, lifebuoy_ornament, navel_piercing, solo, collarbone, crop_top_overhang, looking_at_viewer, black_choker, cowboy_shot, eyewear_on_head, grey_belt, hair_intakes, red-tinted_eyewear, smile, blue_jacket, long_sleeves, open_jacket, standing, two-tone_shirt, thigh_strap, thighhighs, cutoffs, side-tie_bikini_bottom, underboob, ear_piercing, groin, bare_shoulders, off_shoulder, snap-fit_buckle, simple_background, sky |
| 5 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, crop_top_overhang, hairclip, heart_necklace, looking_at_viewer, midriff, navel, official_alternate_costume, sleeveless_shirt, solo, tennis_uniform, two-tone_shirt, water_bottle, x_hair_ornament, blush, sweat, two-tone_skirt, chain-link_fence, holding, parted_lips, sitting, thighs, wristband, open_mouth, tennis_ball, tennis_racket, underboob, white_shirt, white_skirt |
| 6 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, hairclip, heart_necklace, looking_at_viewer, official_alternate_costume, sleeveless_shirt, solo, x_hair_ornament, collarbone, crop_top_overhang, see-through, simple_background, tennis_uniform, two-tone_shirt, white_background, bra, sweat, upper_body |
| 7 | 8 |  |  |  |  |  | 1girl, black_panties, blush, looking_at_viewer, official_alternate_costume, two-tone_shirt, two-tone_skirt, bare_shoulders, crop_top_overhang, day, hairclip, solo, tennis_uniform, x_hair_ornament, ass, chain-link_fence, outdoors, sleeveless_shirt, underboob, blue_sky, sweat, thighs, cloud, looking_back, from_behind, from_below, open_mouth, tennis_racket |
| 8 | 19 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, solo, cleavage, official_alternate_hairstyle, official_alternate_costume, thighs, bare_shoulders, white_thighhighs, red_hair, black_jacket, hair_down, stomach, sitting, teeth, grin, huge_breasts, navel_piercing, tank_top, bracelet, simple_background, white_background, black_skirt, nail_polish, shorts |
| 9 | 11 |  |  |  |  |  | 1girl, cleavage, solo, bare_shoulders, looking_at_viewer, white_dress, official_alternate_costume, white_thighhighs, wedding_dress, bridal_veil, flower, blush, collarbone, jewelry, red_hair, bouquet, detached_sleeves, full_body, garter_straps, grin, hair_intakes, red_ribbon, sitting |
| 10 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, necklace, school_uniform, blush, solo, white_shirt, collared_shirt, pleated_skirt, black_skirt, cardigan, smile, collarbone, holding, open_mouth, simple_background, white_background, bra_peek, piercing, pink_bra, thighs, cowboy_shot, hair_intakes, miniskirt, sitting, smartphone |
| 11 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, blush, china_dress, cleavage, double_bun, eyewear_on_head, official_alternate_costume, pelvic_curtain, round_eyewear, solo, black_thighhighs, bridal_gauntlets, looking_at_viewer, thighs, covered_navel, sideboob, hair_intakes, braid, highleg, nail_polish, standing_on_one_leg, grin, parted_lips, underwear |
| 12 | 22 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, china_dress, cleavage, double_bun, eyewear_on_head, hair_intakes, official_alternate_costume, round_eyewear, solo, pelvic_curtain, bra_peek, bridal_gauntlets, looking_at_viewer, sleeveless_dress, strapless_bra, braided_bun, sunglasses, black_thighhighs, tinted_eyewear, cowboy_shot, blush, brown_bra, covered_navel, standing_on_one_leg, skindentation, hair_ribbon, open_mouth, simple_background, thighs, white_background, nail_polish, panties, sideboob |
| 13 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, open_mouth, penis, sex, solo_focus, vaginal, huge_breasts, spread_legs, sweat, collarbone, mosaic_censoring, on_back, hair_intakes, navel_piercing, stomach, tongue_out, bed_sheet, choker, completely_nude, cum_in_pussy, heavy_breathing, looking_at_viewer, pillow, tears, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_sweater | official_alternate_costume | sweater_dress | long_sleeves | looking_at_viewer | bra_strap | cleavage | off-shoulder_sweater | open_jacket | solo | strap_between_breasts | white_jacket | black_choker | black_ribbon | collarbone | off-shoulder_dress | star_print | blush | hair_intakes | hair_ribbon | smile | eyewear_hang | sunglasses | simple_background | standing | white_background | black_hairband | cowboy_shot | disposable_cup | holding | orange-tinted_eyewear | bubble_tea | no_mole | open_mouth | sitting | tongue | upper_body | thighs | blue_hair | grin | indoors | short_sleeves | white_shirt | window | huge_breasts | navel | black_panties | teeth | wet | bikini | sky | closed_mouth | night | water | navel_piercing | pink_bikini | black_shorts | short_shorts | belt | thighhighs | lifebuoy_ornament | front-tie_bikini_top | highleg_bikini | nail_polish | bikini_under_clothes | midriff | crop_top_overhang | eyewear_on_head | grey_belt | red-tinted_eyewear | blue_jacket | two-tone_shirt | thigh_strap | cutoffs | side-tie_bikini_bottom | underboob | ear_piercing | groin | off_shoulder | snap-fit_buckle | hairclip | heart_necklace | sleeveless_shirt | tennis_uniform | water_bottle | x_hair_ornament | sweat | two-tone_skirt | chain-link_fence | parted_lips | wristband | tennis_ball | tennis_racket | white_skirt | see-through | bra | day | ass | outdoors | blue_sky | cloud | looking_back | from_behind | from_below | official_alternate_hairstyle | white_thighhighs | red_hair | black_jacket | hair_down | stomach | tank_top | bracelet | black_skirt | shorts | white_dress | wedding_dress | bridal_veil | flower | jewelry | bouquet | detached_sleeves | full_body | garter_straps | red_ribbon | necklace | school_uniform | collared_shirt | pleated_skirt | cardigan | bra_peek | piercing | pink_bra | miniskirt | smartphone | black_dress | china_dress | double_bun | pelvic_curtain | round_eyewear | black_thighhighs | bridal_gauntlets | covered_navel | sideboob | braid | highleg | standing_on_one_leg | underwear | sleeveless_dress | strapless_bra | braided_bun | tinted_eyewear | brown_bra | skindentation | panties | 1boy | hetero | nipples | penis | sex | solo_focus | vaginal | spread_legs | mosaic_censoring | on_back | tongue_out | bed_sheet | choker | completely_nude | cum_in_pussy | heavy_breathing | pillow | tears |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:----------------|:-----------------------------|:----------------|:---------------|:--------------------|:------------|:-----------|:-----------------------|:--------------|:-------|:------------------------|:---------------|:---------------|:---------------|:-------------|:---------------------|:-------------|:--------|:---------------|:--------------|:--------|:---------------|:-------------|:--------------------|:-----------|:-------------------|:-----------------|:--------------|:-----------------|:----------|:------------------------|:-------------|:----------|:-------------|:----------|:---------|:-------------|:---------|:------------|:-------|:----------|:----------------|:--------------|:---------|:---------------|:--------|:----------------|:--------|:------|:---------|:------|:---------------|:--------|:--------|:-----------------|:--------------|:---------------|:---------------|:-------|:-------------|:--------------------|:-----------------------|:-----------------|:--------------|:-----------------------|:----------|:--------------------|:------------------|:------------|:---------------------|:--------------|:-----------------|:--------------|:----------|:-------------------------|:------------|:---------------|:--------|:---------------|:------------------|:-----------|:-----------------|:-------------------|:-----------------|:---------------|:------------------|:--------|:-----------------|:-------------------|:--------------|:------------|:--------------|:----------------|:--------------|:--------------|:------|:------|:------|:-----------|:-----------|:--------|:---------------|:--------------|:-------------|:-------------------------------|:-------------------|:-----------|:---------------|:------------|:----------|:-----------|:-----------|:--------------|:---------|:--------------|:----------------|:--------------|:---------|:----------|:----------|:-------------------|:------------|:----------------|:-------------|:-----------|:-----------------|:-----------------|:----------------|:-----------|:-----------|:-----------|:-----------|:------------|:-------------|:--------------|:--------------|:-------------|:-----------------|:----------------|:-------------------|:-------------------|:----------------|:-----------|:--------|:----------|:----------------------|:------------|:-------------------|:----------------|:--------------|:-----------------|:------------|:----------------|:----------|:-------|:---------|:----------|:--------|:------|:-------------|:----------|:--------------|:-------------------|:----------|:-------------|:------------|:---------|:------------------|:---------------|:------------------|:---------|:--------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | | | | | X | | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | | | | X | | X | | | X | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | | | | | X | | X | | | X | | | X | | X | | | X | | | X | | | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 18 |  |  |  |  |  | X | X | | | | X | X | | X | | X | X | | | X | | X | | | X | X | | X | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | | | | X | | | | | | | | | | | | X | | | | X | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | X | | | X | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | X | X | X | X | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | X | | | | | X | | X | X | | X | X | X | X | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 19 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | X | | | X | | | | | | X | | X | | | | | | | | | X | | | X | | X | | | | | X | | | X | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 11 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | X | | | X | X | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 10 |  |  |  |  |  | X | | | | | | X | | X | | | X | | | | | X | | | X | X | | X | | | X | | X | | X | | X | | | | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 8 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 22 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | | | | X | X | X | | | X | X | | X | | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 13 | 8 |  |  |  |  |  | X | | | | | | X | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | | | X | | | | X | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |