datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
jijivski/metaculus_binary | ---
license: apache-2.0
---
|
dnagpt/human_genome_GCF_009914755.1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1032653672
num_examples: 989132
- name: test
num_bytes: 10431648
num_examples: 9992
download_size: 472762984
dataset_size: 1043085320
license: apache-2.0
---
# Dataset Card for "human_genome_GCF_009914755.1"
How to build this dataset:

The full human genome data comes from:
https://www.ncbi.nlm.nih.gov/datasets/genome/GCF_009914755.1/

Preprocessing:

1. Download the data using the NCBI Datasets command-line tool:

```
curl -o datasets 'https://ftp.ncbi.nlm.nih.gov/pub/datasets/command-line/LATEST/linux-amd64/datasets'
chmod +x datasets
./datasets download genome accession GCF_009914755.1 --filename genomes/human_genome_dataset.zip
```

then unpack the archive and move the genome FASTA file to `human2.fna`.

2. Write the original data as plain DNA text, 1000 bp (letters) per line:
```python
filename = "human2.fna"
data_file = open(filename, 'r')
out_filename = "human2.fna.line"
out_file = open(out_filename, 'w')
max_line_len = 1000  # 1000 letters per output line

text = ""
for line in data_file:
    line = line.strip()
    if line.find(">") != 0:  # skip FASTA header lines
        line = line.upper()
        line = line.replace(" ", "").replace("N", "")  # remove N bases and spaces
        text = text + line
        if len(text) > max_line_len:
            text = text.strip()
            out_file.write(text + "\n")
            text = ""  # clear text

# last line: a trailing fragment shorter than max_line_len is discarded
if len(text) <= max_line_len:
    pass

data_file.close()
out_file.close()
```
3. Split the data into training and validation sets:
```python
filename = "human2.fna.line"
data_file = open(filename, 'r')
out_train_filename = "human2.fna.line.train"
out_train_file = open(out_train_filename, 'w')
out_valid_filename = "human2.fna.line.valid"
out_valid_file = open(out_valid_filename, 'w')

line_num = 0
select_line_num = 0
for line in data_file:
    if 0 == line_num % 3:  # keep 1/3 of the data
        if select_line_num % 100:  # every 100th kept line (~1%) goes to validation
            out_train_file.write(line)
        else:
            out_valid_file.write(line)
        select_line_num = select_line_num + 1
    line_num = line_num + 1

data_file.close()
out_train_file.close()
out_valid_file.close()
```
4. Then we can use the files locally, or push them to the Hub:
```python
from datasets import load_dataset

data_files = {"train": "human2.fna.line.train", "test": "human2.fna.line.valid"}
train_dataset = load_dataset("text", data_files=data_files)
train_dataset.push_to_hub("dnagpt/human_genome_GCF_009914755.1", token="hf_*******")
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/testing_codealpaca_small | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 31503
num_examples: 100
- name: test
num_bytes: 29802
num_examples: 100
download_size: 44006
dataset_size: 61305
---
# Dataset Card for "testing_codealpaca_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/ocr-generated-machine-readable-zone-mrz-text-detection | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-to-text
- object-detection
language:
- en
tags:
- code
- legal
---
# OCR GENERATED Machine-Readable Zone (MRZ) Text Detection
The dataset includes a collection of **GENERATED** photos containing Machine Readable Zones (MRZ) commonly found on identification documents such as passports, visas, and ID cards. Each photo in the dataset is accompanied by **text detection** and **Optical Character Recognition (OCR)** results.
This dataset is useful for developing applications related to *document verification, identity authentication, or automated data extraction from identification documents*.
### The dataset is solely for informational or educational purposes and should not be used for any fraudulent or deceptive activities.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/ocr-machine-readable-zone-mrz?utm_source=huggingface&utm_medium=cpc&utm_campaign=ocr-generated-machine-readable-zone-mrz-text-detection) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains the original images of the documents
- **boxes** - includes bounding box labeling for the original images
- **annotations.xml** - contains coordinates of the bounding boxes and detected text, created for the original photo
# Data Format
Each image from the `images` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the bounding boxes and the detected text. For each point, the x and y coordinates are provided.
# Example of XML file structure
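The exact XML schema is only illustrated as a screenshot on the dataset page, so here is a minimal parsing sketch over a hypothetical CVAT-style `annotations.xml` (the element and attribute names, and the sample snippet itself, are assumptions; verify them against the real file):

```python
import xml.etree.ElementTree as ET

# Hypothetical annotations.xml snippet in a CVAT-style layout (an assumption;
# the values below are invented for illustration).
SAMPLE = """
<annotations>
  <image id="0" name="passport_001.png" width="1000" height="700">
    <polygon label="mrz" points="100.0,550.0;900.0,550.0;900.0,650.0;100.0,650.0">
      <attribute name="text">P&lt;UTOERIKSSON&lt;&lt;ANNA&lt;MARIA</attribute>
    </polygon>
  </image>
</annotations>
"""

def extract_mrz_annotations(xml_text):
    """Return (image_name, [(x, y), ...], detected_text) for each labeled region."""
    root = ET.fromstring(xml_text)
    results = []
    for image in root.iter("image"):
        for poly in image.iter("polygon"):
            points = [tuple(float(v) for v in pt.split(","))
                      for pt in poly.get("points").split(";")]
            attr = poly.find("attribute")
            text = attr.text if attr is not None else None
            results.append((image.get("name"), points, text))
    return results

annotations = extract_mrz_annotations(SAMPLE)
print(annotations[0][0])  # passport_001.png
```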
# Text detection in the documents can be performed in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/ocr-machine-readable-zone-mrz?utm_source=huggingface&utm_medium=cpc&utm_campaign=ocr-generated-machine-readable-zone-mrz-text-detection) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
tinkerface/Minerb | ---
license: apache-2.0
---
|
lowem1/ocr-bert_cms-vocab | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10038136
num_examples: 925113
download_size: 1633020
dataset_size: 10038136
---
# Dataset Card for "ocr-bert_cms-vocab"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thmk/java_10 | ---
dataset_info:
features:
- name: code
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: size
dtype: int64
splits:
- name: train
num_bytes: 605825109
num_examples: 100000
download_size: 195428485
dataset_size: 605825109
---
# Dataset Card for "java_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harpomaxx/jurisgpt | ---
license: openrail
configs:
- config_name: default
data_files:
- split: train
path: "train_set.json"
- split: test
path: "test_set.json"
---
### Dataset Description
#### Title
**Legal Texts and Summaries Dataset**
#### Description
This dataset is a collection of legal documents and their associated summaries, subjects (materia), and keywords (voces). It is primarily focused on the field of labor law, with particular emphasis on legal proceedings, labor rights, and workers' compensation laws in Argentina.
#### Structure
Each entry in the dataset contains the following fields:
- `sumario`: A unique identifier for the legal document.
- `materia`: The subject of the legal document, in this case, "DERECHO DEL TRABAJO" (Labor Law).
- `voces`: Keywords or phrases summarizing the main topics of the document, such as "FALLO PLENARIO", "DERECHO LABORAL", "LEY SOBRE RIESGOS DEL TRABAJO", etc.
- `sentencia`: The text of the legal document, which includes references to laws, legal precedents, and detailed analysis. The text was summarized using the Claude v2 LLM.
- `texto`: The legal summary.
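To illustrate the record layout described above, here is a small sketch over made-up records (all values are invented; in particular, whether `voces` is stored as a list or a delimited string may differ in the released JSON):

```python
# Illustrative records mirroring the field layout described above.
records = [
    {
        "sumario": "SUM-0001",
        "materia": "DERECHO DEL TRABAJO",
        "voces": ["FALLO PLENARIO", "LEY SOBRE RIESGOS DEL TRABAJO"],
        "sentencia": "Full text of the ruling ...",
        "texto": "Legal summary of the ruling ...",
    },
    {
        "sumario": "SUM-0002",
        "materia": "DERECHO DEL TRABAJO",
        "voces": ["DERECHO LABORAL"],
        "sentencia": "Another ruling ...",
        "texto": "Another summary ...",
    },
]

def filter_by_voz(entries, keyword):
    """Return the entries whose `voces` keywords include `keyword`."""
    return [e for e in entries if keyword in e["voces"]]

plenarios = filter_by_voz(records, "FALLO PLENARIO")
print([e["sumario"] for e in plenarios])  # ['SUM-0001']
```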
#### Applications
This dataset is valuable for legal research, especially in the domain of labor law. It can be used for training models in legal text summarization, keyword extraction, and legal document classification. Additionally, it's useful for academic research in legal studies, especially regarding labor law and workers' compensation in Argentina.
#### Format
The dataset is provided in JSON format, ensuring easy integration with most data processing and machine learning tools.
#### Language
The content is predominantly in Spanish, reflecting its focus on Argentine law.
#### Source and Authenticity
The data is compiled from official legal documents and summaries from Argentina. It's important for users to verify the authenticity and current relevance of the legal texts as they might have undergone revisions or may not reflect the latest legal standings.
|
irds/lotte_pooled_dev_search | ---
pretty_name: '`lotte/pooled/dev/search`'
viewer: false
source_datasets: ['irds/lotte_pooled_dev']
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/pooled/dev/search`
The `lotte/pooled/dev/search` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/pooled/dev/search).
# Data
This dataset provides:
- `queries` (i.e., topics); count=2,931
- `qrels`: (relevance assessments); count=8,573
- For `docs`, use [`irds/lotte_pooled_dev`](https://huggingface.co/datasets/irds/lotte_pooled_dev)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/lotte_pooled_dev_search', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/lotte_pooled_dev_search', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
jmc255/aphantasia_drawing_dataset | ---
language:
- en
tags:
- medical
- psychology
pretty_name: Aphantasic Drawing Dataset
---
# Aphantasic Drawing Dataset
<!-- Provide a quick summary of the dataset. -->
This dataset contains data from an online memory drawing experiment conducted with individuals with aphantasia and normal imagery.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This dataset comes from the Brain Bridge Lab from the University of Chicago.
It is from an online memory drawing experiment with 61 individuals with aphantasia
and 52 individuals with normal imagery. In the experiment, participants 1) studied 3
separate scene photographs presented one after the other, 2) drew them from memory, 3)
completed a recognition task, 4) copied the images while viewing them, and 5) filled out the VVIQ
and OSIQ questionnaires along with demographic questions. The scenes the participants were asked
to draw were of a kitchen, a bedroom, and a living room. The control (normal imagery) and
treatment (aphantasia) groups were determined by VVIQ scores: those with a score >= 40 were in the control group,
and those with a score <= 25 were in the aphantasia group. For more information on the experiment and design,
see the paper linked below.
The original repository for the data from the experiment
was made available on the OSF website linked below.
It was created July 31, 2020 and last updated September 27, 2023.
- **Curated by:** Wilma Bainbridge, Zoe Pounder, Alison Eardley, Chris Baker
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Original Repository:** https://osf.io/cahyd/
- **Paper:** https://doi.org/10.1016/j.cortex.2020.11.014
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Example of the structure of the data:
```
{
"subject_id": 111,
"treatment": "aphantasia",
"demographics": {
"country": "United States",
"age": 56,
"gender": "male",
"occupation": "sales/music",
"art_ability": 3,
"art_experience": "None",
"device": "desktop",
"input": "mouse",
"difficult": "memorydrawing",
"diff_explanation": "drawing i have little patience for i can do it but i will spend hours editing (never got the hang of digital drawing on screens",
"vviq_score": 16,
"osiq_score": 61
},
"drawings": {
"kitchen": {
"perception": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>,
"memory": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>
},
"livingroom": {
"perception": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>,
"memory": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>
},
"bedroom": {
"perception": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>,
"memory": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>
}
},
"image": {
"kitchen": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>,
"livingroom": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>,
"bedroom": <PIL.PngImagePlugin.PngImageFile image mode=RGB size=500x500>
}
}
```
## Data Fields
- subject_id: Subject ID
- treatment: group for the experiment based on VVIQ score; "aphantasia"(VVIQ<=25) or "control"(VVIQ>=40)
- country: Participant's country
- age: Age
- gender: Gender
- occupation: Occupation
- art_ability: Self-Report rating of art ability from 1-5
- art_experience: Description of participant's art experience
- device: Device used for the experiment (desktop or laptop)
- input: What participant used to draw (mouse, trackpad, etc.)
- difficult: Part of the experiment the participant thought was most challenging
- diff_explanation: Explanation of why they thought the part specified in "difficult" was hard
- vviq_score: VVIQ test total points (16 questions from 0-5)
- osiq_score: OSIQ test total points (30 questions from 0-5)
- perception drawings: Participant drawings of kitchen, living room, and bedroom from perception part of experiment
- memory drawings: Participant drawings of kitchen, living room, and bedroom from memory part of experiment
- image: Actual images (stimuli) participants had to draw (kitchen, living room, bedroom)
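The VVIQ-based grouping rule described above (control: VVIQ >= 40; aphantasia: VVIQ <= 25) can be sketched as a small helper (illustrative only, not part of the released data):

```python
def assign_group(vviq_score):
    """Assign the experimental group from a VVIQ total score, per the rule above.

    Scores between 26 and 39 fall into neither group and were excluded.
    """
    if vviq_score <= 25:
        return "aphantasia"
    if vviq_score >= 40:
        return "control"
    return None  # excluded: neither aphantasia nor control

print(assign_group(16))  # aphantasia (matches the example subject above)
```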
## Dataset Creation
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
The original repository for the data included a folder for the participants' drawings (1 for aphantasia and 1 for control),
a folder of the scene images they had to draw (stimuli), and an Excel file with the demographic survey and test
score data.
The Excel file had 117 rows, and the 2 folders for aphantasia and control had 115 subject folders in total.
Subject 168 did not fill out the demographic information or take the VVIQ and OSIQ tests, and
subjects 160, 161, and 162 did not have any drawings. These 4 were removed during the data
cleaning process, and the final JSON has 114 participants.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The original Excel file did not have total scores for the VVIQ and OSIQ tests, but instead
individual points for each question. The total scores were calculated from these numbers.
The original drawing folders for each participant typically had 18 files. There were 6 files
of interest: the 3 drawings from the memory part of the experiment, and the 3 drawings from
the perception part of the experiment. They had file names that were easy to distinguish:
For memory: sub{subid}-mem{1, 2, or 3}-{room}.png, where the room was either livingroom, bedroom, or kitchen,
and 1, 2, or 3 depending on the order in which the participant did the drawings.
For perception: sub{subid}-pic{1, 2, or 3}-{room}.png
These files were matched with the Excel file rows by subject ID,
so each participant typically had 6 drawings total (some participants
did not label what they were drawing, so the file name was not in the normal
format, and therefore they did not have all their drawings).
The actual image folder had 3 images (kitchen, living room, bedroom) that
were replicated to go with each of the 114 participants.
The final format of the data is linked above.
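The filename convention above can be parsed with a small helper (a sketch; the patterns are taken from the description, while the helper itself is illustrative and not part of the released dataset):

```python
import re

# Pattern from the description above: sub{subid}-mem{n}-{room}.png for memory
# drawings and sub{subid}-pic{n}-{room}.png for perception drawings.
PATTERN = re.compile(
    r"sub(?P<subid>\d+)-(?P<kind>mem|pic)(?P<order>[123])-"
    r"(?P<room>kitchen|livingroom|bedroom)\.png"
)

def parse_drawing_filename(name):
    """Return (subject_id, task, order, room), or None for non-standard names."""
    m = PATTERN.fullmatch(name)
    if m is None:
        return None
    task = "memory" if m.group("kind") == "mem" else "perception"
    return int(m.group("subid")), task, int(m.group("order")), m.group("room")

print(parse_drawing_filename("sub111-mem2-kitchen.png"))  # (111, 'memory', 2, 'kitchen')
```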
#### Example Analysis
https://colab.research.google.com/drive/1FFnVkaw-jr3ygEGDK41YyAyKlqvTDvkg?usp=sharing
#### Data Producers
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
The contributors of the original dataset and authors of the paper are:
- Wilma Bainbridge (University of Chicago Department of Psychology & Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA),
- Zoë Pounder (Department of Psychology, University of Westminster, London, UK)
- Alison Eardley (Department of Psychology, University of Westminster, London, UK)
- Chris Baker (Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA)
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@misc{Bainbridge_Pounder_Eardley_Baker_2023,
  title={Quantifying Aphantasia through drawing: Those without visual imagery show deficits in object but not spatial memory},
  url={osf.io/cahyd},
  publisher={OSF},
  author={Bainbridge, Wilma A and Pounder, Zoë and Eardley, Alison and Baker, Chris I},
  year={2023},
  month={Sep}
}
```
**APA:**
Bainbridge, W. A., Pounder, Z., Eardley, A., & Baker, C. I. (2023, September 27). Quantifying Aphantasia through drawing: Those without visual imagery show deficits in object but not spatial memory. Retrieved from osf.io/cahyd
## Glossary
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
Aphantasia: The inability to create visual imagery, keeping you from visualizing things in your mind.
|
sabuhi1997/fine-tune-hebrew-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': validation
splits:
- name: train
num_bytes: 5714802.0
num_examples: 8
- name: validation
num_bytes: 1759819.0
num_examples: 3
- name: test
num_bytes: 1625529.0
num_examples: 4
download_size: 7719156
dataset_size: 9100150.0
---
# Dataset Card for "fine-tune-hebrew-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/comet_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of comet/コメット/彗星 (Azur Lane)
This is the dataset of comet/コメット/彗星 (Azur Lane), containing 34 images and their tags.
The core tags of this character are `green_hair, long_hair, red_eyes, twintails, ahoge, hat, bangs, beret, breasts, hair_between_eyes, hair_ornament, white_headwear, ribbon, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 41.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 25.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 77 | 50.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 36.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 77 | 67.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/comet_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | smile, 1girl, solo, open_mouth, star_(symbol), blush, choker, looking_at_viewer, puffy_sleeves, white_thighhighs, blue_skirt, plaid_skirt, white_shirt, long_sleeves, collared_shirt, one_eye_closed, ;d, hair_ribbon, pleated_skirt, retrofit_(azur_lane), white_background |
| 1 | 5 |  |  |  |  |  | 2girls, smile, 1girl, blush, looking_at_viewer, open_mouth, solo_focus, blonde_hair, thighhighs, collarbone, one_eye_closed, skirt, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | smile | 1girl | solo | open_mouth | star_(symbol) | blush | choker | looking_at_viewer | puffy_sleeves | white_thighhighs | blue_skirt | plaid_skirt | white_shirt | long_sleeves | collared_shirt | one_eye_closed | ;d | hair_ribbon | pleated_skirt | retrofit_(azur_lane) | white_background | 2girls | solo_focus | blonde_hair | thighhighs | collarbone | skirt | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:----------------|:--------|:---------|:--------------------|:----------------|:-------------------|:-------------|:--------------|:--------------|:---------------|:-----------------|:-----------------|:-----|:--------------|:----------------|:-----------------------|:-------------------|:---------|:-------------|:--------------|:-------------|:-------------|:--------|:-----------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | | | X | | | | | | X | X | X | X | X | X | X |
|
tessiw/german_OpenOrca2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 453043119
num_examples: 250000
download_size: 257694182
dataset_size: 453043119
---
# Dataset Card for "german_OpenOrca2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zaydzuhri/the_pile_tokenized_5percent | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 46366029035
num_examples: 6000000
download_size: 16007372812
dataset_size: 46366029035
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Eathan/labeled-ortho-training | ---
license: unknown
---
|
junyinc/NINJAL-Ainu-Folklore | ---
license: cc-by-sa-4.0
---
# Dataset Card for NINJAL Ainu Folklore
## Dataset Description
- **Original source** [A Glossed Audio Corpus of Ainu folklore](https://ainu.ninjal.ac.jp/folklore/en/)
### Dataset Summary
Ainu is an endangered (nearly extinct) language spoken in Hokkaido, Japan. This dataset contains recordings of 38 traditional Ainu folktales by two Ainu speakers (Mrs. Kimi Kimura and Mrs. Ito Oda), along with their transcriptions (in Latin script), English translations, and underlying and surface gloss forms in English. (For transcriptions in Katakana and translation/gloss in Japanese, please see the original corpus webpage.) In total, there are over 8 hours (~7.7k sentences) of transcribed and glossed speech.
### Annotations
The glosses in this dataset are the original glosses from the Glossed Audio Corpus, with minor changes to fit the Generalized Glossing Format (e.g. multi-word translations of individual morphemes are now separated by underscores instead of periods). Uncertainty in interpretation by the original annotators is indicated with a question mark (?). Additional notes on the Latin transcriptions in the corpus can be found on the original corpus webpage (under the "Structure, Transcriptions, and Glosses" tab).
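The separator change described above can be sketched as a tiny helper (the gloss strings below are invented for illustration, not taken from the corpus, and the helper assumes periods occur only inside multi-word morpheme translations):

```python
# Hypothetical helper mirroring the convention described above: within a gloss,
# multi-word translations of a single morpheme are joined with underscores
# rather than periods.
def normalize_gloss(gloss):
    """Rewrite '.'-joined multi-word morpheme translations with '_'."""
    return " ".join(token.replace(".", "_") for token in gloss.split(" "))

print(normalize_gloss("pick.up-PST house LOC"))  # pick_up-PST house LOC
```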
## Additional Information
### Limitations
This dataset has a small number of speakers and a limited domain, and models trained on this dataset might not be suitable for general purpose applications. The audio data contain varying degrees of noise which makes this dataset a poor fit for training TTS models.
### Acknowledgement
We would like to thank the original authors of the Glossed Audio Corpus of Ainu Folklore for their dedication and care in compiling these resources, and kindly ask anyone who uses this dataset to cite them in their work.
### License
Attribution-ShareAlike 4.0 International ([cc-by-sa-4.0](https://creativecommons.org/licenses/by-sa/4.0/))
### Original Source
```
@misc{ninjal-ainu-folklore,
title={A Glossed Audio Corpus of Ainu Folklore},
url={https://ainu.ninjal.ac.jp/folklore/},
author={Nakagawa, Hiroshi and Bugaeva, Anna and Kobayashi, Miki and Yoshikawa, Yoshimi},
publisher={The National Institute for Japanese Language and Linguistics ({NINJAL})},
date={2016--2021}
}
``` |
liuyanchen1015/MULTI_VALUE_mnli_shadow_pronouns | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 200969
num_examples: 714
- name: dev_mismatched
num_bytes: 242835
num_examples: 909
- name: test_matched
num_bytes: 235033
num_examples: 840
- name: test_mismatched
num_bytes: 221353
num_examples: 866
- name: train
num_bytes: 9073404
num_examples: 33730
download_size: 6082221
dataset_size: 9973594
---
# Dataset Card for "MULTI_VALUE_mnli_shadow_pronouns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HarryAJMK418/Joyner | ---
license: openrail
---
|
pradeep239/plain_philp_250PDFs | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 620972376.064
num_examples: 1272
- name: validation
num_bytes: 75497524.0
num_examples: 150
- name: test
num_bytes: 37954461.0
num_examples: 75
download_size: 542872519
dataset_size: 734424361.064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
one-sec-cv12/chunk_26 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 16222987440.875
num_examples: 168905
download_size: 14476054976
dataset_size: 16222987440.875
---
# Dataset Card for "chunk_26"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sunjoong/adit-testdata-01 | ---
license: unknown
---
|
deepapaikar/Katzbot_sentences_pairs | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-2.2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-13b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T06:36:10.303707](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1/blob/main/results_2023-10-28T06-36-10.303707.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04467281879194631,\n\
\ \"em_stderr\": 0.0021156186992613577,\n \"f1\": 0.10597210570469756,\n\
\ \"f1_stderr\": 0.0024082864478827287,\n \"acc\": 0.438030054339078,\n\
\ \"acc_stderr\": 0.010411282060464109\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.04467281879194631,\n \"em_stderr\": 0.0021156186992613577,\n\
\ \"f1\": 0.10597210570469756,\n \"f1_stderr\": 0.0024082864478827287\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \
\ \"acc_stderr\": 0.008820485491442476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T06_36_10.303707
path:
- '**/details_harness|drop|3_2023-10-28T06-36-10.303707.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T06-36-10.303707.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T06_36_10.303707
path:
- '**/details_harness|gsm8k|5_2023-10-28T06-36-10.303707.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T06-36-10.303707.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T06_36_10.303707
path:
- '**/details_harness|winogrande|5_2023-10-28T06-36-10.303707.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T06-36-10.303707.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- results_2023-10-01T13-47-59.401032.parquet
- split: 2023_10_28T06_36_10.303707
path:
- results_2023-10-28T06-36-10.303707.parquet
- split: latest
path:
- results_2023-10-28T06-36-10.303707.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-2.2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
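Concretely, a run timestamp such as `2023-10-28T06:36:10.303707` appears as the split name `2023_10_28T06_36_10.303707` (characters like `-` and `:` are not allowed in split names). A minimal sketch of this mapping — the helper name here is ours for illustration, not part of the `datasets` API:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to the corresponding split name.

    Split names may only contain alphanumerics, dots, and underscores,
    so the '-' and ':' characters of the ISO timestamp are replaced.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# The timestamps of this dataset's two runs:
print(timestamp_to_split("2023-10-01T13:47:59.401032"))  # 2023_10_01T13_47_59.401032
print(timestamp_to_split("2023-10-28T06:36:10.303707"))  # 2023_10_28T06_36_10.303707
```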
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T06:36:10.303707](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1/blob/main/results_2023-10-28T06-36-10.303707.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04467281879194631,
"em_stderr": 0.0021156186992613577,
"f1": 0.10597210570469756,
"f1_stderr": 0.0024082864478827287,
"acc": 0.438030054339078,
"acc_stderr": 0.010411282060464109
},
"harness|drop|3": {
"em": 0.04467281879194631,
"em_stderr": 0.0021156186992613577,
"f1": 0.10597210570469756,
"f1_stderr": 0.0024082864478827287
},
"harness|gsm8k|5": {
"acc": 0.11599696739954511,
"acc_stderr": 0.008820485491442476
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.01200207862948574
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yalkgam/talmudbavli | ---
license: mit
---
|
mespinosami/map2sat-edi5k20-samples | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 453584.8
num_examples: 16
- name: test
num_bytes: 125145.2
num_examples: 4
download_size: 596502
dataset_size: 578730.0
---
# Dataset Card for "map2sat-edi5k20-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sowmya15/English_Profanity_Full | ---
license: apache-2.0
---
|
benayas/massive_llm_v4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: partition
dtype: string
- name: scenario
dtype:
class_label:
names:
'0': social
'1': transport
'2': calendar
'3': play
'4': news
'5': datetime
'6': recommendation
'7': email
'8': iot
'9': general
'10': audio
'11': lists
'12': qa
'13': cooking
'14': takeaway
'15': music
'16': alarm
'17': weather
- name: intent
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: utt
dtype: string
- name: annot_utt
dtype: string
- name: worker_id
dtype: string
- name: slot_method
sequence:
- name: slot
dtype: string
- name: method
dtype: string
- name: judgments
sequence:
- name: worker_id
dtype: string
- name: intent_score
dtype: int8
- name: slots_score
dtype: int8
- name: grammar_score
dtype: int8
- name: spelling_score
dtype: int8
- name: language_identification
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 17839343
num_examples: 11514
- name: validation
num_bytes: 3144099
num_examples: 2033
- name: test
num_bytes: 4598528
num_examples: 2974
download_size: 2975271
dataset_size: 25581970
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
pavanBuduguppa/asr_inverse_text_normalization | ---
license: gpl-3.0
---
|
result-kand2-sdxl-wuerst-karlo/e8491cc1 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 168
num_examples: 10
download_size: 1314
dataset_size: 168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e8491cc1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
weijiawu/DSText | ---
license: cc-by-4.0
---
|
fiatrete/dan-used-apps | ---
license: mit
---
|
JelleWo/common_voice_13_0_en_VALTEST_pseudo_labelled | ---
dataset_info:
config_name: en
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: validation
num_bytes: 716137947.988
num_examples: 16372
- name: test
num_bytes: 725460387.324
num_examples: 16372
download_size: 1408503884
dataset_size: 1441598335.312
configs:
- config_name: en
data_files:
- split: validation
path: en/validation-*
- split: test
path: en/test-*
---
|
DiegoRoberto10/diego | ---
license: openrail
---
|
jonathan-roberts1/Million-AID | ---
dataset_info:
features:
- name: image
dtype: image
- name: label_1
dtype:
class_label:
names:
'0': unutilized land
'1': commercial land
'2': public service land
'3': transportation land
'4': industrial land
'5': water area
'6': residential land
'7': agriculture land
- name: label_2
dtype:
class_label:
names:
'0': dam
'1': religious land
'2': rock land
'3': sparse shrub land
'4': arable land
'5': factory area
'6': detached house
'7': desert
'8': lake
'9': power station
'10': beach
'11': ice land
'12': bare land
'13': island
'14': woodland
'15': mobile home park
'16': railway area
'17': river
'18': grassland
'19': apartment
'20': special land
'21': port area
'22': commercial area
'23': highway area
'24': mining area
'25': sports land
'26': airport area
'27': leisure land
- name: label_3
dtype:
class_label:
names:
'0': dam
'1': parking lot
'2': greenhouse
'3': pier
'4': bridge
'5': mine
'6': rock land
'7': baseball field
'8': apron
'9': tennis court
'10': sparse shrub land
'11': works
'12': oil field
'13': meadow
'14': ground track field
'15': detached house
'16': golf course
'17': forest
'18': desert
'19': lake
'20': beach
'21': paddy field
'22': ice land
'23': bare land
'24': storage tank
'25': basketball court
'26': island
'27': substation
'28': mobile home park
'29': cemetery
'30': quarry
'31': solar power plant
'32': helipad
'33': roundabout
'34': runway
'35': wastewater plant
'36': river
'37': apartment
'38': dry field
'39': intersection
'40': swimming pool
'41': commercial area
'42': church
'43': road
'44': orchard
'45': terraced field
'46': stadium
'47': train station
'48': railway
'49': viaduct
'50': wind turbine
splits:
- name: train
num_bytes: 871962498
num_examples: 10000
download_size: 871644115
dataset_size: 871962498
license: other
task_categories:
- image-classification
- zero-shot-image-classification
---
# Dataset Card for "Million-AID"
## Dataset Description
- **Paper** [On creating benchmark dataset for aerial image interpretation: Reviews, guidances, and million-aid](https://ieeexplore.ieee.org/iel7/4609443/9314330/09393553.pdf)
- **Split** Train
## Split Information
This HuggingFace dataset repository contains just the Train split.
### Licensing Information
[CC BY-NC-ND 4.0](https://competitions.codalab.org/competitions/35974#learn_the_details-terms-and-conditions)
## Citation Information
[On creating benchmark dataset for aerial image interpretation: Reviews, guidances, and million-aid](https://ieeexplore.ieee.org/iel7/4609443/9314330/09393553.pdf)
```
@article{long2021creating,
title = {On creating benchmark dataset for aerial image interpretation: Reviews, guidances, and million-aid},
author = {Long, Yang and Xia, Gui-Song and Li, Shengyang and Yang, Wen and Yang, Michael Ying and Zhu, Xiao Xiang and Zhang, Liangpei and Li, Deren},
year = 2021,
journal = {IEEE Journal of selected topics in applied earth observations and remote sensing},
publisher = {IEEE},
volume = 14,
pages = {4205--4230}
}
``` |
Nexdata/Minnan_Dialect_Pronunciation_Dictionary | ---
task_categories:
- automatic-speech-recognition
---
# Dataset Card for Nexdata/Minnan_Dialect_Pronunciation_Dictionary
## Description
Each entry consists of three parts: word, pinyin, and tone. The dictionary can be used to provide a pronunciation reference for sound recording personnel and to support research and development of pronunciation recognition technology.
For more details, please refer to the link: https://www.nexdata.ai/datasets/51?source=Huggingface
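Given the three-part entry structure described above, a dictionary line can be parsed with a short sketch like the following. The tab delimiter, column order, and the sample entry are assumptions for illustration only, since the card does not specify the exact txt layout:

```python
# Minimal sketch of parsing one dictionary entry, assuming a
# tab-separated layout of word, pinyin, and tone per line. The actual
# delimiter and column order in the txt file may differ.
def parse_entry(line):
    """Split a dictionary line into (word, pinyin, tone)."""
    word, pinyin, tone = line.rstrip("\n").split("\t")
    return word, pinyin, tone

# Hypothetical sample entry, for illustration only.
sample = "你好\tli2 ho2\t2 2"
print(parse_entry(sample))
```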
# Specifications
## Format
txt
## Content
87,166 Minnan dialect words and corresponding phonetic symbols.
## Language
Minnan dialect
## Application scenario
speech recognition
# Licensing Information
Commercial License |
ResplendentAI/Synthetic_Soul_1k | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- philosophy
- psychology
pretty_name: Synthetic Soul 1k
size_categories:
- n<1K
---
This is a semi-synthetic dataset generated using RAG based on my collected writings over a ten-year period of isolation. This dataset may be useful for therapeutic purposes as well as imparting a philosophical or psychological slant to deep conversations. |
najju/aslgpc-psl | ---
dataset_info:
features:
- name: gloss
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 13475111
num_examples: 87710
download_size: 7583458
dataset_size: 13475111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TejSinguluri/rmt_anno_donut_3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 224420193.0
num_examples: 308
- name: validation
num_bytes: 53826227.0
num_examples: 78
download_size: 243397602
dataset_size: 278246420.0
---
# Dataset Card for "rmt_anno_donut_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Defalt-404/Bittensor_relative_QA | ---
task_categories:
- text-generation
tags:
- dataset
- bittensor
- gpt4
- prompt
- response
pretty_name: Bittensor_netuid_1
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is aimed at the Bittensor subnet 1 model. It contains around 3K records, grouped into sets of 3 corresponding questions and answers, in JSONL file format.
Most of the unicode characters have been filtered out, but some remain to add noise to the training data.
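Records like the ones described above can be read with a minimal JSONL sketch. The field names (`questions`, `answers`) are hypothetical, used here only for illustration; the card does not document the actual schema:

```python
import json

# Minimal sketch of reading grouped QA records from JSONL lines.
# The "questions"/"answers" field names are assumptions, not
# confirmed by the dataset card.
def load_jsonl(lines):
    """Parse an iterable of JSONL lines into a list of records."""
    return [json.loads(line) for line in lines if line.strip()]

sample = ['{"questions": ["Q1", "Q2", "Q3"], "answers": ["A1", "A2", "A3"]}']
records = load_jsonl(sample)
print(len(records[0]["questions"]))  # → 3
```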
## Dataset Creation
### Source Data [https://huggingface.co/datasets/mrseeker87/bittensor_qa/]
## Contact [https://github.com/Kunj-2206] |
Birchlabs/openai-prm800k-phase2_train-solutions-only | ---
license: mit
---
|
EnD-Diffusers/v1_DuskfallCrewArtStyle_Lora | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- Art Style
- duskfallcrew
pretty_name: Duskfallcrew Art Style Dataset & Lora
size_categories:
- 1K<n<10K
---
# Dataset Card for DuskfallCrewArtStyle_Lora
## Dataset Description
- **Homepage:** https://duskfallcrew.carrd.co/
- **Point of Contact:** See the Carrd website for contact info, or DM us on HF
### Dataset Summary
This dataset is the basis for the LoRA in this repository.
### Supported Tasks and Leaderboards
Text to Image / Stable Diffusion/ LoRA
### Languages
English
### Source Data
### Personal and Sensitive Information
This is based on our own art, and while we're OK with you using it, you don't own the art within the dataset (though you may not care to anyway).
## Considerations for Using the Data
### Social Impact of Dataset
Shitty Art!
### Discussion of Biases
It largely has non-binary features; we're not sure if it has any one specific gender. We have dissociative identity disorder, so largely the faces in here are either alters in our system or other systems we've done art for.
### Other Known Limitations
SHITTY ART!
## Additional Information
### Licensing Information
While it's under the license listed, we do ask that you don't resell the dataset. You're responsible for your use of the dataset and the faces within it. Your outputs are up to you.
### Citation Information
If you use the dataset, citation is nice, but it'd be even nicer if you gave us coffee! https://ko-fi.com/DUSKFALLcrew
|
sunfu-chou/wine_review | ---
dataset_info:
features:
- name: wine_id
dtype: int64
- name: country
dtype: string
- name: description
dtype: string
- name: designation
dtype: string
- name: points
dtype: int64
- name: price
dtype: float64
splits:
- name: train
num_bytes: 21093175.17523332
num_examples: 68918
- name: test
num_bytes: 5273446.824766681
num_examples: 17230
download_size: 15007824
dataset_size: 26366622.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yuvalkirstain/pexel_images_lots | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2466579957.125
num_examples: 7999
download_size: 2418558487
dataset_size: 2466579957.125
---
# Dataset Card for "pexel_images_lots"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Duongkum999/duong | ---
license: mit
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jxie/emg | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float64
- name: label
dtype: int64
splits:
- name: val
num_bytes: 738492
num_examples: 41
- name: train
num_bytes: 2197464
num_examples: 122
- name: test
num_bytes: 738492
num_examples: 41
download_size: 472145
dataset_size: 3674448
---
# Dataset Card for "emg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tristan/olm-october-2022-tokenized-1024-perplexity-filters | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 78517730052.0
num_examples: 12754667
download_size: 21283341524
dataset_size: 78517730052.0
---
# Dataset Card for "olm-october-2022-tokenized-1024-perplexity-filters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713064214 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2789235
num_examples: 6774
download_size: 1394377
dataset_size: 2789235
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mbrack/School_BUD-E | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 2672328170
num_examples: 1116330
download_size: 1187653458
dataset_size: 2672328170
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
unography/synth-bg-remove-v1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 3008602601.3
num_examples: 32050
- name: test
num_bytes: 2584899.0
num_examples: 20
download_size: 3005034290
dataset_size: 3011187500.3
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tunachiu/sroie | ---
task_categories:
- token-classification
language:
- en
size_categories:
- 10K<n<100K
--- |
open-llm-leaderboard/details_AA051615__A0304 | ---
pretty_name: Evaluation run of AA051615/A0304
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051615/A0304](https://huggingface.co/AA051615/A0304) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051615__A0304\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T02:38:53.174033](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__A0304/blob/main/results_2024-03-05T02-38-53.174033.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8329894986297788,\n\
\ \"acc_stderr\": 0.0242885840106617,\n \"acc_norm\": 0.8418851861297316,\n\
\ \"acc_norm_stderr\": 0.024662491905159404,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5334887043569735,\n\
\ \"mc2_stderr\": 0.01542333936417327\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.01367881039951882\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n\
\ \"acc_stderr\": 0.0048201660022530795,\n \"acc_norm\": 0.8278231428002389,\n\
\ \"acc_norm_stderr\": 0.003767625141611702\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8,\n \
\ \"acc_stderr\": 0.03455473702325438,\n \"acc_norm\": 0.8,\n \"\
acc_norm_stderr\": 0.03455473702325438\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.025648341251693598,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.025648341251693598\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.83,\n\
\ \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.9056603773584906,\n \"acc_stderr\": 0.017989860174445108,\n\
\ \"acc_norm\": 0.9056603773584906,\n \"acc_norm_stderr\": 0.017989860174445108\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9722222222222222,\n\
\ \"acc_stderr\": 0.013742429025504288,\n \"acc_norm\": 0.9722222222222222,\n\
\ \"acc_norm_stderr\": 0.013742429025504288\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8439306358381503,\n\
\ \"acc_stderr\": 0.027672473701627075,\n \"acc_norm\": 0.8439306358381503,\n\
\ \"acc_norm_stderr\": 0.027672473701627075\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n\
\ \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8765957446808511,\n \"acc_stderr\": 0.02150090885460025,\n\
\ \"acc_norm\": 0.8765957446808511,\n \"acc_norm_stderr\": 0.02150090885460025\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6754385964912281,\n\
\ \"acc_stderr\": 0.044045561573747685,\n \"acc_norm\": 0.6754385964912281,\n\
\ \"acc_norm_stderr\": 0.044045561573747685\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8482758620689655,\n \"acc_stderr\": 0.029896107594574617,\n\
\ \"acc_norm\": 0.8482758620689655,\n \"acc_norm_stderr\": 0.029896107594574617\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.791005291005291,\n \"acc_stderr\": 0.020940481565334852,\n \"\
acc_norm\": 0.791005291005291,\n \"acc_norm_stderr\": 0.020940481565334852\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6190476190476191,\n\
\ \"acc_stderr\": 0.04343525428949099,\n \"acc_norm\": 0.6190476190476191,\n\
\ \"acc_norm_stderr\": 0.04343525428949099\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9516129032258065,\n\
\ \"acc_stderr\": 0.012207189992293699,\n \"acc_norm\": 0.9516129032258065,\n\
\ \"acc_norm_stderr\": 0.012207189992293699\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.7684729064039408,\n \"acc_stderr\": 0.029678333141444455,\n\
\ \"acc_norm\": 0.7684729064039408,\n \"acc_norm_stderr\": 0.029678333141444455\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\"\
: 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9272727272727272,\n \"acc_stderr\": 0.02027824987170499,\n\
\ \"acc_norm\": 0.9272727272727272,\n \"acc_norm_stderr\": 0.02027824987170499\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9646464646464646,\n \"acc_stderr\": 0.013157318878046081,\n \"\
acc_norm\": 0.9646464646464646,\n \"acc_norm_stderr\": 0.013157318878046081\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792215,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792215\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8948717948717949,\n \"acc_stderr\": 0.01555124753133301,\n \
\ \"acc_norm\": 0.8948717948717949,\n \"acc_norm_stderr\": 0.01555124753133301\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.6148148148148148,\n \"acc_stderr\": 0.029670906124630882,\n \
\ \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.029670906124630882\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.9243697478991597,\n \"acc_stderr\": 0.017174988814938515,\n\
\ \"acc_norm\": 0.9243697478991597,\n \"acc_norm_stderr\": 0.017174988814938515\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.6622516556291391,\n \"acc_stderr\": 0.03861557546255168,\n \"\
acc_norm\": 0.6622516556291391,\n \"acc_norm_stderr\": 0.03861557546255168\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9522935779816514,\n \"acc_stderr\": 0.009138489155094907,\n \"\
acc_norm\": 0.9522935779816514,\n \"acc_norm_stderr\": 0.009138489155094907\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.028353212866863445,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.028353212866863445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9607843137254902,\n \"acc_stderr\": 0.013623692819208817,\n \"\
acc_norm\": 0.9607843137254902,\n \"acc_norm_stderr\": 0.013623692819208817\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9620253164556962,\n \"acc_stderr\": 0.012441831939410399,\n \
\ \"acc_norm\": 0.9620253164556962,\n \"acc_norm_stderr\": 0.012441831939410399\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8699551569506726,\n\
\ \"acc_stderr\": 0.02257451942417486,\n \"acc_norm\": 0.8699551569506726,\n\
\ \"acc_norm_stderr\": 0.02257451942417486\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9586776859504132,\n \"acc_stderr\": 0.01816929195354394,\n \"\
acc_norm\": 0.9586776859504132,\n \"acc_norm_stderr\": 0.01816929195354394\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9447852760736196,\n \"acc_stderr\": 0.017944712448654605,\n\
\ \"acc_norm\": 0.9447852760736196,\n \"acc_norm_stderr\": 0.017944712448654605\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6964285714285714,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.6964285714285714,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9514563106796117,\n \"acc_stderr\": 0.021279466201922566,\n\
\ \"acc_norm\": 0.9514563106796117,\n \"acc_norm_stderr\": 0.021279466201922566\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n\
\ \"acc_stderr\": 0.014450181176872738,\n \"acc_norm\": 0.9487179487179487,\n\
\ \"acc_norm_stderr\": 0.014450181176872738\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594166,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594166\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.946360153256705,\n\
\ \"acc_stderr\": 0.008056911822364858,\n \"acc_norm\": 0.946360153256705,\n\
\ \"acc_norm_stderr\": 0.008056911822364858\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8554913294797688,\n \"acc_stderr\": 0.018929764513468724,\n\
\ \"acc_norm\": 0.8554913294797688,\n \"acc_norm_stderr\": 0.018929764513468724\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8715083798882681,\n\
\ \"acc_stderr\": 0.01119191561018467,\n \"acc_norm\": 0.8715083798882681,\n\
\ \"acc_norm_stderr\": 0.01119191561018467\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.017995029559531417,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.017995029559531417\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8938906752411575,\n\
\ \"acc_stderr\": 0.017491946161302,\n \"acc_norm\": 0.8938906752411575,\n\
\ \"acc_norm_stderr\": 0.017491946161302\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.017486432785880704,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.017486432785880704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.7269503546099291,\n \"acc_stderr\": 0.02657786094330785,\n \
\ \"acc_norm\": 0.7269503546099291,\n \"acc_norm_stderr\": 0.02657786094330785\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.7972620599739244,\n\
\ \"acc_stderr\": 0.010268263054572542,\n \"acc_norm\": 0.7972620599739244,\n\
\ \"acc_norm_stderr\": 0.010268263054572542\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9448529411764706,\n \"acc_stderr\": 0.013866237730790694,\n\
\ \"acc_norm\": 0.9448529411764706,\n \"acc_norm_stderr\": 0.013866237730790694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8839869281045751,\n \"acc_stderr\": 0.012955547759523029,\n \
\ \"acc_norm\": 0.8839869281045751,\n \"acc_norm_stderr\": 0.012955547759523029\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8181818181818182,\n\
\ \"acc_stderr\": 0.03694284335337802,\n \"acc_norm\": 0.8181818181818182,\n\
\ \"acc_norm_stderr\": 0.03694284335337802\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8734693877551021,\n \"acc_stderr\": 0.02128270062614058,\n\
\ \"acc_norm\": 0.8734693877551021,\n \"acc_norm_stderr\": 0.02128270062614058\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.945273631840796,\n\
\ \"acc_stderr\": 0.01608281579626326,\n \"acc_norm\": 0.945273631840796,\n\
\ \"acc_norm_stderr\": 0.01608281579626326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \
\ \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6686746987951807,\n\
\ \"acc_stderr\": 0.03664314777288086,\n \"acc_norm\": 0.6686746987951807,\n\
\ \"acc_norm_stderr\": 0.03664314777288086\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9415204678362573,\n \"acc_stderr\": 0.01799667885728013,\n\
\ \"acc_norm\": 0.9415204678362573,\n \"acc_norm_stderr\": 0.01799667885728013\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5334887043569735,\n\
\ \"mc2_stderr\": 0.01542333936417327\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345396\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6004548900682335,\n \
\ \"acc_stderr\": 0.013491660298815985\n }\n}\n```"
repo_url: https://huggingface.co/AA051615/A0304
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|arc:challenge|25_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|gsm8k|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hellaswag|10_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T02-38-53.174033.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T02-38-53.174033.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- '**/details_harness|winogrande|5_2024-03-05T02-38-53.174033.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T02-38-53.174033.parquet'
- config_name: results
data_files:
- split: 2024_03_05T02_38_53.174033
path:
- results_2024-03-05T02-38-53.174033.parquet
- split: latest
path:
- results_2024-03-05T02-38-53.174033.parquet
---
# Dataset Card for Evaluation run of AA051615/A0304
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051615/A0304](https://huggingface.co/AA051615/A0304) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051615__A0304",
"harness_winogrande_5",
split="train")
```
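Split names follow a fixed convention: the run timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A small helper to derive a split name from a timestamp — a sketch for illustration, not part of any official API — might look like:

```python
def timestamp_to_split(ts: str) -> str:
    """Derive the split name used in this dataset from a run timestamp,
    e.g. "2024-03-05T02:38:53.174033" -> "2024_03_05T02_38_53.174033"."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-05T02:38:53.174033"))
# → 2024_03_05T02_38_53.174033
```

To load one specific run instead of the latest results, pass the derived string as the `split` argument of `load_dataset`.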
## Latest results
These are the [latest results from run 2024-03-05T02:38:53.174033](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__A0304/blob/main/results_2024-03-05T02-38-53.174033.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.8329894986297788,
"acc_stderr": 0.0242885840106617,
"acc_norm": 0.8418851861297316,
"acc_norm_stderr": 0.024662491905159404,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5334887043569735,
"mc2_stderr": 0.01542333936417327
},
"harness|arc:challenge|25": {
"acc": 0.6296928327645052,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.01367881039951882
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.0048201660022530795,
"acc_norm": 0.8278231428002389,
"acc_norm_stderr": 0.003767625141611702
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8,
"acc_stderr": 0.03455473702325438,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03455473702325438
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.025648341251693598,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.025648341251693598
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.9056603773584906,
"acc_stderr": 0.017989860174445108,
"acc_norm": 0.9056603773584906,
"acc_norm_stderr": 0.017989860174445108
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9722222222222222,
"acc_stderr": 0.013742429025504288,
"acc_norm": 0.9722222222222222,
"acc_norm_stderr": 0.013742429025504288
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.8439306358381503,
"acc_stderr": 0.027672473701627075,
"acc_norm": 0.8439306358381503,
"acc_norm_stderr": 0.027672473701627075
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8765957446808511,
"acc_stderr": 0.02150090885460025,
"acc_norm": 0.8765957446808511,
"acc_norm_stderr": 0.02150090885460025
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6754385964912281,
"acc_stderr": 0.044045561573747685,
"acc_norm": 0.6754385964912281,
"acc_norm_stderr": 0.044045561573747685
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8482758620689655,
"acc_stderr": 0.029896107594574617,
"acc_norm": 0.8482758620689655,
"acc_norm_stderr": 0.029896107594574617
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.791005291005291,
"acc_stderr": 0.020940481565334852,
"acc_norm": 0.791005291005291,
"acc_norm_stderr": 0.020940481565334852
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6190476190476191,
"acc_stderr": 0.04343525428949099,
"acc_norm": 0.6190476190476191,
"acc_norm_stderr": 0.04343525428949099
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9516129032258065,
"acc_stderr": 0.012207189992293699,
"acc_norm": 0.9516129032258065,
"acc_norm_stderr": 0.012207189992293699
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7684729064039408,
"acc_stderr": 0.029678333141444455,
"acc_norm": 0.7684729064039408,
"acc_norm_stderr": 0.029678333141444455
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9272727272727272,
"acc_stderr": 0.02027824987170499,
"acc_norm": 0.9272727272727272,
"acc_norm_stderr": 0.02027824987170499
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9646464646464646,
"acc_stderr": 0.013157318878046081,
"acc_norm": 0.9646464646464646,
"acc_norm_stderr": 0.013157318878046081
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792215,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792215
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8948717948717949,
"acc_stderr": 0.01555124753133301,
"acc_norm": 0.8948717948717949,
"acc_norm_stderr": 0.01555124753133301
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.029670906124630882,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.029670906124630882
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9243697478991597,
"acc_stderr": 0.017174988814938515,
"acc_norm": 0.9243697478991597,
"acc_norm_stderr": 0.017174988814938515
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6622516556291391,
"acc_stderr": 0.03861557546255168,
"acc_norm": 0.6622516556291391,
"acc_norm_stderr": 0.03861557546255168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9522935779816514,
"acc_stderr": 0.009138489155094907,
"acc_norm": 0.9522935779816514,
"acc_norm_stderr": 0.009138489155094907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.028353212866863445,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.028353212866863445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9607843137254902,
"acc_stderr": 0.013623692819208817,
"acc_norm": 0.9607843137254902,
"acc_norm_stderr": 0.013623692819208817
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9620253164556962,
"acc_stderr": 0.012441831939410399,
"acc_norm": 0.9620253164556962,
"acc_norm_stderr": 0.012441831939410399
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8699551569506726,
"acc_stderr": 0.02257451942417486,
"acc_norm": 0.8699551569506726,
"acc_norm_stderr": 0.02257451942417486
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9586776859504132,
"acc_stderr": 0.01816929195354394,
"acc_norm": 0.9586776859504132,
"acc_norm_stderr": 0.01816929195354394
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9447852760736196,
"acc_stderr": 0.017944712448654605,
"acc_norm": 0.9447852760736196,
"acc_norm_stderr": 0.017944712448654605
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6964285714285714,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.6964285714285714,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.9514563106796117,
"acc_stderr": 0.021279466201922566,
"acc_norm": 0.9514563106796117,
"acc_norm_stderr": 0.021279466201922566
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872738,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872738
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594166,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594166
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.946360153256705,
"acc_stderr": 0.008056911822364858,
"acc_norm": 0.946360153256705,
"acc_norm_stderr": 0.008056911822364858
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8554913294797688,
"acc_stderr": 0.018929764513468724,
"acc_norm": 0.8554913294797688,
"acc_norm_stderr": 0.018929764513468724
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8715083798882681,
"acc_stderr": 0.01119191561018467,
"acc_norm": 0.8715083798882681,
"acc_norm_stderr": 0.01119191561018467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.017995029559531417,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.017995029559531417
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8938906752411575,
"acc_stderr": 0.017491946161302,
"acc_norm": 0.8938906752411575,
"acc_norm_stderr": 0.017491946161302
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.017486432785880704,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.017486432785880704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7269503546099291,
"acc_stderr": 0.02657786094330785,
"acc_norm": 0.7269503546099291,
"acc_norm_stderr": 0.02657786094330785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.7972620599739244,
"acc_stderr": 0.010268263054572542,
"acc_norm": 0.7972620599739244,
"acc_norm_stderr": 0.010268263054572542
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9448529411764706,
"acc_stderr": 0.013866237730790694,
"acc_norm": 0.9448529411764706,
"acc_norm_stderr": 0.013866237730790694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8839869281045751,
"acc_stderr": 0.012955547759523029,
"acc_norm": 0.8839869281045751,
"acc_norm_stderr": 0.012955547759523029
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03694284335337802,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03694284335337802
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8734693877551021,
"acc_stderr": 0.02128270062614058,
"acc_norm": 0.8734693877551021,
"acc_norm_stderr": 0.02128270062614058
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.945273631840796,
"acc_stderr": 0.01608281579626326,
"acc_norm": 0.945273631840796,
"acc_norm_stderr": 0.01608281579626326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.01969463855669321,
"acc_norm": 0.96,
"acc_norm_stderr": 0.01969463855669321
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6686746987951807,
"acc_stderr": 0.03664314777288086,
"acc_norm": 0.6686746987951807,
"acc_norm_stderr": 0.03664314777288086
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9415204678362573,
"acc_stderr": 0.01799667885728013,
"acc_norm": 0.9415204678362573,
"acc_norm_stderr": 0.01799667885728013
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5334887043569735,
"mc2_stderr": 0.01542333936417327
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345396
},
"harness|gsm8k|5": {
"acc": 0.6004548900682335,
"acc_stderr": 0.013491660298815985
}
}
```
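The leaderboard's MMLU score is simply the mean of the `harness|hendrycksTest-*` accuracies above. A minimal sketch of that aggregation, using a small subset of the results shown for brevity:

```python
# Subset of the per-task results shown above (the full dict has 57 MMLU tasks).
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.6686746987951807},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.9415204678362573},
    "harness|winogrande|5": {"acc": 0.7853196527229677},  # not an MMLU task
}

# Average accuracy over the MMLU (hendrycksTest) tasks only.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))
# → 0.8051
```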
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeonsworld/CarbonVillain-en-10.7B-v4](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T18:31:04.687700](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v4/blob/main/results_2023-12-30T18-31-04.687700.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.666631910220006,\n\
\ \"acc_stderr\": 0.031628172453272874,\n \"acc_norm\": 0.6673323282914742,\n\
\ \"acc_norm_stderr\": 0.032273428394848376,\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314747,\n \"mc2\": 0.7195429760825822,\n\
\ \"mc2_stderr\": 0.014995726763948506\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7141007767377017,\n\
\ \"acc_stderr\": 0.00450918191932285,\n \"acc_norm\": 0.8847839075881299,\n\
\ \"acc_norm_stderr\": 0.0031863002304505757\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4973544973544973,\n \"acc_stderr\": 0.02575094967813039,\n \"\
acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.02575094967813039\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\
\ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\
\ \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4941329856584094,\n\
\ \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.4941329856584094,\n\
\ \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314747,\n \"mc2\": 0.7195429760825822,\n\
\ \"mc2_stderr\": 0.014995726763948506\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6557998483699773,\n \
\ \"acc_stderr\": 0.013086800426693784\n }\n}\n```"
repo_url: https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|arc:challenge|25_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|gsm8k|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hellaswag|10_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-31-04.687700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T18-31-04.687700.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- '**/details_harness|winogrande|5_2023-12-30T18-31-04.687700.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T18-31-04.687700.parquet'
- config_name: results
data_files:
- split: 2023_12_30T18_31_04.687700
path:
- results_2023-12-30T18-31-04.687700.parquet
- split: latest
path:
- results_2023-12-30T18-31-04.687700.parquet
---
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-10.7B-v4](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v4",
"harness_winogrande_5",
split="train")
```
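Each per-task configuration name listed above follows a fixed pattern: `harness_`, the harness task name with `-` and `:` replaced by `_`, then the few-shot count. As a minimal sketch (the `config_name` helper is an illustration, not part of this repository or the `datasets` library), the mapping can be written as:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name to the dataset config name used in this repo.

    e.g. "hendrycksTest-world_religions" with 5 shots becomes
    "harness_hendrycksTest_world_religions_5".
    """
    # Replace the separators used in harness task names ("-" and ":")
    # with underscores, then append the few-shot count.
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{num_fewshot}"


# The resulting string can be passed as the second argument to load_dataset:
print(config_name("truthfulqa:mc", 0))   # harness_truthfulqa_mc_0
print(config_name("arc:challenge", 25))  # harness_arc_challenge_25
```

This is convenient when iterating over many tasks programmatically instead of copying each config name from the list above.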
## Latest results
These are the [latest results from run 2023-12-30T18:31:04.687700](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v4/blob/main/results_2023-12-30T18-31-04.687700.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" config and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.666631910220006,
"acc_stderr": 0.031628172453272874,
"acc_norm": 0.6673323282914742,
"acc_norm_stderr": 0.032273428394848376,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314747,
"mc2": 0.7195429760825822,
"mc2_stderr": 0.014995726763948506
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266125
},
"harness|hellaswag|10": {
"acc": 0.7141007767377017,
"acc_stderr": 0.00450918191932285,
"acc_norm": 0.8847839075881299,
"acc_norm_stderr": 0.0031863002304505757
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.02575094967813039,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.02575094967813039
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4941329856584094,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.4941329856584094,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314747,
"mc2": 0.7195429760825822,
"mc2_stderr": 0.014995726763948506
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.6557998483699773,
"acc_stderr": 0.013086800426693784
}
}
```
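The headline MMLU number in the `"all"` block is obtained by averaging the individual `harness|hendrycksTest-*` subtask scores. A minimal sketch of that aggregation, using a small hand-copied subset of the results above (the full run has 57 MMLU subtasks; a plain macro-average is assumed here):

```python
# Sketch: macro-average MMLU subtask accuracies from a results dict.
# The dict below is a small hand-copied subset of the results above, for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.42},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.756578947368421},
    "harness|arc:challenge|25": {"acc_norm": 0.712457337883959},  # not an MMLU task
}

# Keep only the MMLU (hendrycksTest) subtasks, then macro-average acc_norm.
mmlu_tasks = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc_norm"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU subtasks: {len(mmlu_tasks)}, macro-average acc_norm: {mmlu_avg:.4f}")
# → MMLU subtasks: 3, macro-average acc_norm: 0.5971
```

With the full results JSON loaded instead of the inline subset, the same filter-and-average recovers the MMLU column shown on the leaderboard.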
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kojima-r/birddb_small2 | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1011384430.775
num_examples: 77501
download_size: 2139041561
dataset_size: 1011384430.775
---
# Dataset Card for "birddb_small2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FarhatMay/coco_celeba_700 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 43625153.0
num_examples: 699
download_size: 43509053
dataset_size: 43625153.0
---
# Dataset Card for "coco_celeba_700"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dtthanh/llama_test | ---
language:
- vi
license: llama2
size_categories:
- n<1K
task_categories:
- question-answering
pretty_name: Uncle Dao Demo
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 203506
num_examples: 449
download_size: 0
dataset_size: 203506
---
|
irds/mmarco_pt_train_v1.1 | ---
pretty_name: '`mmarco/pt/train/v1.1`'
viewer: false
source_datasets: ['irds/mmarco_pt', 'irds/mmarco_pt_train']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/pt/train/v1.1`
The `mmarco/pt/train/v1.1` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/pt/train/v1.1).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- For `docs`, use [`irds/mmarco_pt`](https://huggingface.co/datasets/irds/mmarco_pt)
- For `qrels`, use [`irds/mmarco_pt_train`](https://huggingface.co/datasets/irds/mmarco_pt_train)
- For `docpairs`, use [`irds/mmarco_pt_train`](https://huggingface.co/datasets/irds/mmarco_pt_train)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_pt_train_v1.1', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
mstz/nursery | ---
language:
- en
tags:
- nursery
- tabular_classification
- UCI
pretty_name: Nursery
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- nursery
- nursery_binary
license: cc
---
# Nursery
The [Nursery dataset](https://archive-beta.ics.uci.edu/dataset/76/nursery) from the [UCI repository](https://archive-beta.ics.uci.edu/).
Should the nursery school accept the student application?
# Configurations and tasks
| **Configuration** | **Task** |
|-------------------|---------------------------|
| nursery | Multiclass classification |
| nursery_binary | Binary classification | |
joelniklaus/greek_legal_ner | ---
annotations_creators:
- other
language_creators:
- found
language:
- el
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: null
pretty_name: Greek Legal Named Entity Recognition
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
tags:
- legal
---
# Dataset Card for Greek Legal Named Entity Recognition
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://legislation.di.uoa.gr/publications?language=en
- **Repository:**
- **Paper:** Angelidis, I., Chalkidis, I., & Koubarakis, M. (2018). Named Entity Recognition, Linking and Generation for Greek Legislation. JURIX.
- **Leaderboard:**
- **Point of Contact:** [Ilias Chalkidis](mailto:ilias.chalkidis@di.ku.dk); [Joel Niklaus](mailto:joel.niklaus.2@bfh.ch)
### Dataset Summary
This dataset contains an annotated corpus for named entity recognition in Greek legislations. It is the first of its kind for the Greek language in such an extended form and one of the few that examines legal text in a full spectrum entity recognition.
### Supported Tasks and Leaderboards
The dataset supports the task of named entity recognition.
### Languages
The language in the dataset is Greek as it used in the Greek Government Gazette.
## Dataset Structure
### Data Instances
The file format is jsonl and three data splits are present (train, validation and test).
### Data Fields
The files contain the following data fields
- `date`: The date when the document was published.
- `gazette`: The government gazette of the document. Either `A` or `D`
- `A` is the general one, publishing standard legislation
- `D` is meant for legislation on urban planning and such things
- `words`: The list of tokens obtained by applying the spacy (v 3.3.1) Greek tokenizer on the sentences. For more information see `convert_to_hf_dataset.py`.
- `ner`: The list of ner tags. The list of labels for the named entities that are covered by the dataset are the following:
- `FACILITY`: Facilities, such as police stations, departments etc.
- `GPE`: Geopolitical Entity; any reference to a geopolitical entity (e.g., country, city, Greek administrative unit, etc.)
- `LEG-REFS`: Legislation Reference; any reference to Greek or European legislation (e.g., Presidential Decrees, Laws, Decisions, EU Regulations and Directives, etc.)
- `LOCATION-NAT`: Well defined natural location, such as rivers, mountains, lakes etc.
- `LOCATION-UNK`: Poorly defined locations such as "End of road X" or other locations that are not "official".
- `ORG`: Organization; any reference to a public or private organization, such as: international organizations (e.g., European Union, United Nations, etc.), Greek public organizations (e.g., Social Insurance Institution) or private ones (e.g., companies, NGOs, etc.).
- `PERSON`: Any formal name of a person mentioned in the text (e.g., Greek government members, public administration officials, etc.).
- `PUBLIC-DOCS`: Public Document Reference; any reference to documents or decisions that have been published by a public institution (organization) that are not considered a primary source of legislation (e.g., local decisions, announcements, memorandums, directives).
- `O`: No entity annotation present
The final tagset (in IOB notation) is the following: `['O', 'B-ORG', 'I-ORG', 'B-GPE', 'I-GPE', 'B-LEG-REFS', 'I-LEG-REFS', 'B-PUBLIC-DOCS', 'I-PUBLIC-DOCS', 'B-PERSON', 'I-PERSON', 'B-FACILITY', 'I-FACILITY', 'B-LOCATION-UNK', 'I-LOCATION-UNK', 'B-LOCATION-NAT', 'I-LOCATION-NAT']`
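As a quick illustration of the tagset, here is a minimal sketch that maps integer label ids back to IOB tag strings, assuming the `ner` column stores indices into the tagset listed above (as is common with a Hugging Face `ClassLabel` feature; the example ids are hypothetical):

```python
# The IOB tagset from the dataset card, in order.
tagset = ['O', 'B-ORG', 'I-ORG', 'B-GPE', 'I-GPE', 'B-LEG-REFS', 'I-LEG-REFS',
          'B-PUBLIC-DOCS', 'I-PUBLIC-DOCS', 'B-PERSON', 'I-PERSON',
          'B-FACILITY', 'I-FACILITY', 'B-LOCATION-UNK', 'I-LOCATION-UNK',
          'B-LOCATION-NAT', 'I-LOCATION-NAT']

def ids_to_tags(ids):
    """Convert a list of integer label ids to IOB tag strings."""
    return [tagset[i] for i in ids]

# Hypothetical example: a two-token PERSON entity surrounded by O tokens.
print(ids_to_tags([0, 9, 10, 0]))  # → ['O', 'B-PERSON', 'I-PERSON', 'O']
```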
### Data Splits
The dataset has three splits: *train*, *validation* and *test*.
Split across the documents:
| split | number of documents |
|:---------------|--------------------:|
| train | 23723 |
| validation | 5478 |
| test | 5084 |
Split across NER labels:
| NER label + split | number of instances |
|:-----------------------------------------------|----------------------:|
| ('FACILITY', 'test') | 142 |
| ('FACILITY', 'train') | 1224 |
| ('FACILITY', 'validation') | 60 |
| ('GPE', 'test') | 1083 |
| ('GPE', 'train') | 5400 |
| ('GPE', 'validation') | 1214 |
| ('LEG-REFS', 'test') | 1331 |
| ('LEG-REFS', 'train') | 5159 |
| ('LEG-REFS', 'validation') | 1382 |
| ('LOCATION-NAT', 'test') | 26 |
| ('LOCATION-NAT', 'train') | 145 |
| ('LOCATION-NAT', 'validation') | 2 |
| ('LOCATION-UNK', 'test') | 205 |
| ('LOCATION-UNK', 'train') | 1316 |
| ('LOCATION-UNK', 'validation') | 283 |
| ('ORG', 'test') | 1354 |
| ('ORG', 'train') | 5906 |
| ('ORG', 'validation') | 1506 |
| ('PERSON', 'test') | 491 |
| ('PERSON', 'train') | 1921 |
| ('PERSON', 'validation') | 475 |
| ('PUBLIC-DOCS', 'test') | 452 |
| ('PUBLIC-DOCS', 'train') | 2652 |
| ('PUBLIC-DOCS', 'validation') | 556 |
## Dataset Creation
### Curation Rationale
Creating a big dataset for Greek named entity recognition and entity linking.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Greek Government Gazette
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
According to (Angelidis et al., 2018) the authors of the paper annotated the data: *"Our group annotated all of the above documents for the 6 entity types that we examine."*
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Note that the information given in this dataset card refers to the dataset version as provided by Joel Niklaus and Veton Matoshi. The dataset at hand is intended to be part of a bigger benchmark dataset. Creating a benchmark dataset consisting of several other datasets from different sources requires postprocessing. Therefore, the structure of the dataset at hand, including the folder structure, may differ considerably from the original dataset. In addition, differences with regard to dataset statistics as given in the respective papers can be expected. The reader is advised to have a look at the conversion script ```convert_to_hf_dataset.py``` in order to retrace the steps for converting the original dataset into the present jsonl format. For further information on the original dataset structure, we refer to the bibliographical references and the original GitHub repositories and/or web pages provided in this dataset card.
## Additional Information
### Dataset Curators
The names of the original dataset curators and creators can be found in references given below, in the section *Citation Information*.
Additional changes were made by Joel Niklaus ([Email](mailto:joel.niklaus.2@bfh.ch); [Github](https://github.com/joelniklaus)) and Veton Matoshi ([Email](mailto:veton.matoshi@bfh.ch); [Github](https://github.com/kapllan)).
### Licensing Information
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/)
### Citation Information
```
@inproceedings{Angelidis2018NamedER,
author = {Angelidis, Iosif and Chalkidis, Ilias and Koubarakis, Manolis},
booktitle = {JURIX},
keywords = {greek,legal nlp,named entity recognition},
title = {{Named Entity Recognition, Linking and Generation for Greek Legislation}},
year = {2018}
}
```
### Contributions
Thanks to [@JoelNiklaus](https://github.com/joelniklaus) and [@kapllan](https://github.com/kapllan) for adding this dataset. |
dmargutierrez/chicago_early_childhood_education_centers | ---
dataset_info:
features:
- name: Id
dtype: int64
- name: Site name
dtype: string
- name: Address
dtype: string
- name: Zip
dtype: float64
- name: Phone
dtype: float64
- name: Program Name
dtype: string
- name: Length of Day
dtype: string
- name: Neighborhood
dtype: string
- name: Funded Enrollment
dtype: string
- name: Program Option
dtype: string
- name: Eearly Head Start Fund
dtype: string
- name: CC fund
dtype: string
- name: Progmod
dtype: string
- name: Website
dtype: string
- name: Center Director
dtype: string
- name: ECE Available Programs
dtype: string
- name: NAEYC Valid Until
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
'10': '10'
'11': '11'
'12': '12'
'13': '339'
'14': '13'
'15': '14'
'16': '15'
'17': '16'
'18': '17'
'19': '18'
'20': '19'
'21': '20'
'22': '21'
'23': '22'
'24': '23'
'25': '24'
'26': '25'
'27': '26'
'28': '27'
'29': '386'
'30': '28'
'31': '29'
'32': '30'
'33': '31'
'34': '32'
'35': '33'
'36': '34'
'37': '35'
'38': '36'
'39': '37'
'40': '38'
'41': '39'
'42': '40'
'43': '41'
'44': '42'
'45': '43'
'46': '44'
'47': '45'
'48': '46'
'49': '47'
'50': '48'
'51': '49'
'52': '50'
'53': '51'
'54': '52'
'55': '53'
'56': '54'
'57': '55'
'58': '56'
'59': '57'
'60': '58'
'61': '59'
'62': '60'
'63': '61'
'64': '62'
'65': '63'
'66': '64'
'67': '65'
'68': '66'
'69': '67'
'70': '68'
'71': '69'
'72': '70'
'73': '71'
'74': '72'
'75': '73'
'76': '74'
'77': '75'
'78': '875'
'79': '884'
'80': '76'
'81': '77'
'82': '78'
'83': '79'
'84': '80'
'85': '81'
'86': '82'
'87': '83'
'88': '84'
'89': '85'
'90': '86'
'91': '87'
'92': '88'
'93': '89'
'94': '90'
'95': '91'
'96': '92'
'97': '93'
'98': '94'
'99': '95'
'100': '96'
'101': '97'
'102': '98'
'103': '99'
'104': '100'
'105': '101'
'106': '102'
'107': '103'
'108': '104'
'109': '105'
'110': '106'
'111': '107'
'112': '108'
'113': '109'
'114': '110'
'115': '111'
'116': '112'
'117': '113'
'118': '114'
'119': '115'
'120': '116'
'121': '117'
'122': '118'
'123': '119'
'124': '120'
'125': '121'
'126': '122'
'127': '123'
'128': '124'
'129': '125'
'130': '126'
'131': '127'
'132': '128'
'133': '129'
'134': '130'
'135': '131'
'136': '132'
'137': '133'
'138': '134'
'139': '135'
'140': '136'
'141': '137'
'142': '138'
'143': '139'
'144': '140'
'145': '141'
'146': '142'
'147': '143'
'148': '144'
'149': '145'
'150': '146'
'151': '249'
'152': '147'
'153': '148'
'154': '149'
'155': '150'
'156': '151'
'157': '152'
'158': '153'
'159': '154'
'160': '155'
'161': '156'
'162': '157'
'163': '158'
'164': '159'
'165': '160'
'166': '161'
'167': '162'
'168': '163'
'169': '164'
'170': '165'
'171': '166'
'172': '167'
'173': '168'
'174': '169'
'175': '170'
'176': '171'
'177': '172'
'178': '173'
'179': '174'
'180': '175'
'181': '176'
'182': '177'
'183': '178'
'184': '179'
'185': '180'
'186': '181'
'187': '182'
'188': '183'
'189': '189'
'190': '184'
'191': '185'
'192': '186'
'193': '187'
'194': '188'
'195': '190'
'196': '191'
'197': '192'
'198': '193'
'199': '194'
'200': '195'
'201': '196'
'202': '197'
'203': '198'
'204': '199'
'205': '200'
'206': '201'
'207': '202'
'208': '203'
'209': '204'
'210': '205'
'211': '206'
'212': '207'
'213': '208'
'214': '209'
'215': '210'
'216': '211'
'217': '212'
'218': '213'
'219': '214'
'220': '215'
'221': '216'
'222': '217'
'223': '218'
'224': '219'
'225': '220'
'226': '221'
'227': '222'
'228': '223'
'229': '224'
'230': '225'
'231': '226'
'232': '227'
'233': '228'
'234': '229'
'235': '230'
'236': '231'
'237': '232'
'238': '233'
'239': '234'
'240': '235'
'241': '236'
'242': '237'
'243': '238'
'244': '239'
'245': '240'
'246': '241'
'247': '242'
'248': '243'
'249': '244'
'250': '245'
'251': '246'
'252': '247'
'253': '248'
'254': '250'
'255': '251'
'256': '252'
'257': '253'
'258': '254'
'259': '255'
'260': '256'
'261': '257'
'262': '258'
'263': '259'
'264': '260'
'265': '261'
'266': '262'
'267': '263'
'268': '264'
'269': '265'
'270': '266'
'271': '267'
'272': '268'
'273': '269'
'274': '270'
'275': '271'
'276': '272'
'277': '273'
'278': '274'
'279': '275'
'280': '276'
'281': '277'
'282': '278'
'283': '279'
'284': '280'
'285': '281'
'286': '282'
'287': '283'
'288': '284'
'289': '285'
'290': '286'
'291': '287'
'292': '288'
'293': '289'
'294': '290'
'295': '291'
'296': '292'
'297': '293'
'298': '294'
'299': '295'
'300': '296'
'301': '297'
'302': '298'
'303': '299'
'304': '300'
'305': '301'
'306': '302'
'307': '303'
'308': '304'
'309': '305'
'310': '306'
'311': '307'
'312': '308'
'313': '309'
'314': '310'
'315': '311'
'316': '312'
'317': '313'
'318': '314'
'319': '315'
'320': '316'
'321': '317'
'322': '318'
'323': '319'
'324': '320'
'325': '321'
'326': '322'
'327': '323'
'328': '324'
'329': '325'
'330': '326'
'331': '327'
'332': '328'
'333': '329'
'334': '330'
'335': '331'
'336': '332'
'337': '333'
'338': '334'
'339': '335'
'340': '336'
'341': '337'
'342': '338'
'343': '340'
'344': '341'
'345': '342'
'346': '343'
'347': '344'
'348': '345'
'349': '346'
'350': '347'
'351': '348'
'352': '349'
'353': '350'
'354': '351'
'355': '352'
'356': '353'
'357': '354'
'358': '355'
'359': '356'
'360': '357'
'361': '358'
'362': '359'
'363': '360'
'364': '361'
'365': '362'
'366': '363'
'367': '364'
'368': '365'
'369': '366'
'370': '367'
'371': '368'
'372': '369'
'373': '370'
'374': '371'
'375': '372'
'376': '373'
'377': '374'
'378': '375'
'379': '376'
'380': '377'
'381': '378'
'382': '379'
'383': '380'
'384': '381'
'385': '382'
'386': '383'
'387': '384'
'388': '385'
'389': '387'
'390': '388'
'391': '389'
'392': '390'
'393': '391'
'394': '392'
'395': '393'
'396': '394'
'397': '395'
'398': '396'
'399': '397'
'400': '398'
'401': '399'
'402': '400'
'403': '401'
'404': '402'
'405': '403'
'406': '404'
'407': '405'
'408': '406'
'409': '407'
'410': '408'
'411': '409'
'412': '410'
'413': '411'
'414': '412'
'415': '413'
'416': '414'
'417': '415'
'418': '416'
'419': '417'
'420': '418'
'421': '419'
'422': '420'
'423': '421'
'424': '422'
'425': '423'
'426': '424'
'427': '425'
'428': '426'
'429': '427'
'430': '428'
'431': '429'
'432': '430'
'433': '431'
'434': '432'
'435': '433'
'436': '434'
'437': '435'
'438': '436'
'439': '437'
'440': '438'
'441': '439'
'442': '440'
'443': '441'
'444': '442'
'445': '443'
'446': '444'
'447': '445'
'448': '446'
'449': '447'
'450': '448'
'451': '449'
'452': '450'
'453': '451'
'454': '452'
'455': '453'
'456': '454'
'457': '455'
'458': '456'
'459': '457'
'460': '458'
'461': '459'
'462': '460'
'463': '461'
'464': '462'
'465': '463'
'466': '464'
'467': '465'
'468': '466'
'469': '467'
'470': '468'
'471': '469'
'472': '470'
'473': '471'
'474': '472'
'475': '473'
'476': '474'
'477': '475'
'478': '476'
'479': '477'
'480': '478'
'481': '479'
'482': '480'
'483': '481'
'484': '482'
'485': '483'
'486': '484'
'487': '485'
'488': '486'
'489': '487'
'490': '488'
'491': '489'
'492': '490'
'493': '491'
'494': '492'
'495': '493'
'496': '494'
'497': '495'
'498': '496'
'499': '497'
'500': '498'
'501': '499'
'502': '500'
'503': '501'
'504': '502'
'505': '503'
'506': '504'
'507': '505'
'508': '506'
'509': '507'
'510': '508'
'511': '509'
'512': '510'
'513': '511'
'514': '512'
'515': '513'
'516': '514'
'517': '515'
'518': '516'
'519': '517'
'520': '518'
'521': '519'
'522': '520'
'523': '521'
'524': '522'
'525': '523'
'526': '524'
'527': '525'
'528': '526'
'529': '527'
'530': '528'
'531': '529'
'532': '530'
'533': '531'
'534': '532'
'535': '533'
'536': '534'
'537': '535'
'538': '536'
'539': '537'
'540': '538'
'541': '539'
'542': '540'
'543': '541'
'544': '542'
'545': '543'
'546': '544'
'547': '545'
'548': '546'
'549': '547'
'550': '548'
'551': '549'
'552': '550'
'553': '551'
'554': '552'
'555': '553'
'556': '554'
'557': '555'
'558': '556'
'559': '557'
'560': '558'
'561': '559'
'562': '560'
'563': '561'
'564': '562'
'565': '563'
'566': '564'
'567': '565'
'568': '566'
'569': '567'
'570': '568'
'571': '569'
'572': '570'
'573': '571'
'574': '572'
'575': '573'
'576': '574'
'577': '575'
'578': '576'
'579': '577'
'580': '578'
'581': '579'
'582': '580'
'583': '581'
'584': '582'
'585': '583'
'586': '584'
'587': '585'
'588': '586'
'589': '587'
'590': '588'
'591': '589'
'592': '590'
'593': '591'
'594': '592'
'595': '593'
'596': '594'
'597': '595'
'598': '596'
'599': '597'
'600': '598'
'601': '599'
'602': '600'
'603': '601'
'604': '602'
'605': '603'
'606': '604'
'607': '605'
'608': '606'
'609': '607'
'610': '608'
'611': '609'
'612': '610'
'613': '611'
'614': '612'
'615': '613'
'616': '614'
'617': '615'
'618': '616'
'619': '617'
'620': '618'
'621': '619'
'622': '620'
'623': '621'
'624': '622'
'625': '623'
'626': '624'
'627': '625'
'628': '626'
'629': '627'
'630': '628'
'631': '629'
'632': '630'
'633': '631'
'634': '632'
'635': '633'
'636': '634'
'637': '635'
'638': '636'
'639': '637'
'640': '638'
'641': '639'
'642': '640'
'643': '641'
'644': '642'
'645': '643'
'646': '644'
'647': '645'
'648': '646'
'649': '647'
'650': '648'
'651': '649'
'652': '650'
'653': '651'
'654': '652'
'655': '653'
'656': '654'
'657': '655'
'658': '656'
'659': '657'
'660': '658'
'661': '659'
'662': '660'
'663': '661'
'664': '662'
'665': '663'
'666': '664'
'667': '665'
'668': '666'
'669': '667'
'670': '668'
'671': '669'
'672': '670'
'673': '671'
'674': '683'
'675': '672'
'676': '673'
'677': '674'
'678': '675'
'679': '676'
'680': '677'
'681': '678'
'682': '679'
'683': '680'
'684': '681'
'685': '682'
'686': '684'
'687': '685'
'688': '686'
'689': '687'
'690': '688'
'691': '689'
'692': '690'
'693': '691'
'694': '692'
'695': '693'
'696': '694'
'697': '695'
'698': '696'
'699': '697'
'700': '698'
'701': '699'
'702': '700'
'703': '701'
'704': '702'
'705': '703'
'706': '704'
'707': '705'
'708': '706'
'709': '707'
'710': '708'
'711': '709'
'712': '710'
'713': '711'
'714': '712'
'715': '713'
'716': '714'
'717': '715'
'718': '716'
'719': '717'
'720': '718'
'721': '719'
'722': '720'
'723': '721'
'724': '722'
'725': '723'
'726': '724'
'727': '739'
'728': '725'
'729': '726'
'730': '727'
'731': '728'
'732': '729'
'733': '730'
'734': '731'
'735': '732'
'736': '733'
'737': '734'
'738': '735'
'739': '736'
'740': '737'
'741': '738'
'742': '740'
'743': '741'
'744': '742'
'745': '743'
'746': '744'
'747': '745'
'748': '746'
'749': '747'
'750': '748'
'751': '749'
'752': '750'
'753': '751'
'754': '752'
'755': '753'
'756': '754'
'757': '755'
'758': '756'
'759': '757'
'760': '758'
'761': '759'
'762': '760'
'763': '761'
'764': '762'
'765': '763'
'766': '764'
'767': '765'
'768': '766'
'769': '767'
'770': '768'
'771': '769'
'772': '770'
'773': '771'
'774': '772'
'775': '773'
'776': '774'
'777': '775'
'778': '776'
'779': '777'
'780': '778'
'781': '779'
'782': '780'
'783': '781'
'784': '782'
'785': '783'
'786': '784'
'787': '785'
'788': '786'
'789': '787'
'790': '788'
'791': '789'
'792': '790'
'793': '791'
'794': '792'
'795': '793'
'796': '794'
'797': '795'
'798': '796'
'799': '797'
'800': '798'
'801': '799'
'802': '800'
'803': '801'
'804': '802'
'805': '803'
'806': '804'
'807': '805'
'808': '806'
'809': '807'
'810': '808'
'811': '809'
'812': '810'
'813': '811'
'814': '812'
'815': '813'
'816': '814'
'817': '815'
'818': '816'
'819': '817'
'820': '818'
'821': '819'
'822': '820'
'823': '821'
'824': '822'
'825': '823'
'826': '824'
'827': '825'
'828': '826'
'829': '827'
'830': '828'
'831': '829'
'832': '830'
'833': '831'
'834': '832'
'835': '833'
'836': '834'
'837': '835'
'838': '836'
'839': '837'
'840': '838'
'841': '839'
'842': '840'
'843': '841'
'844': '842'
'845': '843'
'846': '844'
'847': '845'
'848': '846'
'849': '847'
'850': '848'
'851': '849'
'852': '850'
'853': '851'
'854': '852'
'855': '853'
'856': '854'
'857': '855'
'858': '856'
'859': '857'
'860': '858'
'861': '859'
'862': '860'
'863': '861'
'864': '862'
'865': '863'
'866': '864'
'867': '865'
'868': '866'
'869': '867'
'870': '868'
'871': '869'
'872': '870'
'873': '871'
'874': '872'
'875': '873'
'876': '874'
'877': '876'
'878': '877'
'879': '878'
'880': '879'
'881': '880'
'882': '881'
'883': '882'
'884': '883'
'885': '885'
'886': '886'
'887': '887'
'888': '888'
'889': '889'
'890': '890'
'891': '891'
'892': '892'
'893': '893'
'894': '894'
'895': '895'
'896': '896'
'897': '897'
'898': '898'
'899': '899'
'900': '900'
'901': '901'
'902': '902'
'903': '903'
'904': '904'
'905': '905'
'906': '906'
'907': '907'
'908': '908'
'909': '909'
'910': '910'
'911': '911'
'912': '912'
'913': '913'
'914': '914'
'915': '915'
'916': '916'
'917': '917'
'918': '918'
'919': '919'
'920': '920'
'921': '921'
'922': '922'
'923': '923'
'924': '924'
'925': '925'
'926': '926'
'927': '927'
'928': '928'
'929': '929'
'930': '930'
'931': '931'
'932': '932'
'933': '933'
'934': '934'
'935': '935'
'936': '936'
'937': '937'
'938': '938'
'939': '939'
'940': '940'
'941': '941'
'942': '942'
'943': '943'
'944': '944'
'945': '945'
'946': '946'
'947': '947'
'948': '948'
'949': '949'
'950': '950'
'951': '951'
'952': '952'
'953': '953'
'954': '954'
'955': '955'
'956': '956'
'957': '957'
'958': '958'
'959': '959'
'960': '960'
'961': '961'
'962': '962'
'963': '963'
'964': '964'
'965': '965'
'966': '966'
'967': '967'
'968': '968'
'969': '969'
'970': '970'
'971': '971'
'972': '972'
'973': '973'
'974': '974'
'975': '975'
'976': '976'
'977': '977'
'978': '978'
'979': '979'
'980': '980'
'981': '981'
'982': '982'
'983': '983'
'984': '984'
'985': '985'
'986': '986'
'987': '987'
'988': '988'
'989': '989'
'990': '990'
'991': '991'
'992': '992'
'993': '993'
'994': '994'
'995': '995'
'996': '996'
'997': '997'
'998': '998'
'999': '999'
'1000': '1000'
'1001': '1001'
'1002': '1002'
'1003': '1003'
'1004': '1004'
'1005': '1005'
'1006': '1006'
'1007': '1007'
'1008': '1008'
'1009': '1009'
'1010': '1010'
'1011': '1011'
'1012': '1012'
'1013': '1013'
'1014': '1014'
'1015': '1015'
'1016': '1016'
'1017': '1017'
'1018': '1018'
'1019': '1019'
'1020': '1020'
'1021': '1021'
'1022': '1022'
'1023': '1023'
'1024': '1024'
'1025': '1025'
'1026': '1026'
'1027': '1027'
'1028': '1028'
'1029': '1029'
'1030': '1030'
'1031': '1031'
'1032': '1032'
'1033': '1033'
'1034': '1034'
'1035': '1035'
'1036': '1036'
'1037': '1037'
'1038': '1038'
'1039': '1039'
'1040': '1040'
'1041': '1041'
'1042': '1042'
'1043': '1043'
'1044': '1044'
'1045': '1045'
'1046': '1046'
'1047': '1047'
'1048': '1048'
'1049': '1049'
'1050': '1050'
'1051': '1051'
'1052': '1052'
'1053': '1053'
'1054': '1054'
'1055': '1055'
'1056': '1056'
'1057': '1057'
'1058': '1058'
'1059': '1059'
'1060': '1060'
'1061': '1061'
'1062': '1062'
'1063': '1063'
'1064': '1064'
'1065': '1065'
'1066': '1066'
'1067': '1067'
'1068': '1068'
'1069': '1069'
'1070': '1070'
'1071': '1071'
'1072': '1072'
'1073': '1073'
'1074': '1074'
'1075': '1075'
'1076': '1076'
'1077': '1077'
'1078': '1078'
'1079': '1079'
'1080': '1080'
'1081': '1081'
'1082': '1082'
'1083': '1083'
'1084': '1084'
'1085': '1085'
'1086': '1086'
'1087': '1087'
'1088': '1088'
'1089': '1089'
'1090': '1090'
'1091': '1091'
'1092': '1092'
'1093': '1093'
'1094': '1094'
'1095': '1095'
'1096': '1096'
'1097': '1097'
'1098': '1098'
'1099': '1099'
'1100': '1100'
'1101': '1101'
'1102': '1102'
'1103': '1103'
'1104': '1104'
'1105': '1105'
'1106': '1106'
'1107': '1107'
'1108': '1108'
'1109': '1109'
'1110': '1110'
'1111': '1111'
'1112': '1112'
'1113': '1113'
'1114': '1114'
'1115': '1115'
'1116': '1116'
'1117': '1117'
'1118': '1118'
'1119': '1119'
'1120': '1120'
'1121': '1121'
'1122': '1122'
'1123': '1123'
'1124': '1124'
'1125': '1125'
'1126': '1126'
'1127': '1127'
'1128': '1128'
'1129': '1129'
'1130': '1130'
'1131': '1131'
'1132': '1132'
'1133': '1133'
'1134': '1134'
'1135': '1135'
'1136': '1136'
'1137': '1137'
'1138': '1138'
'1139': '1139'
'1140': '1140'
'1141': '1141'
'1142': '1142'
'1143': '1143'
'1144': '1144'
'1145': '1145'
'1146': '1146'
'1147': '1147'
'1148': '1148'
'1149': '1149'
'1150': '1150'
'1151': '1151'
'1152': '1152'
'1153': '1153'
'1154': '1154'
'1155': '1155'
'1156': '1156'
'1157': '1157'
'1158': '1158'
'1159': '1159'
'1160': '1160'
'1161': '1161'
- name: tisix_row_index
dtype: string
splits:
- name: train
num_bytes: 662438
num_examples: 3337
download_size: 247923
dataset_size: 662438
---
# Dataset Card for "chicago_early_childhood_education_centers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ArhamNaeem/code-gen-train | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 168825
num_examples: 610
download_size: 73824
dataset_size: 168825
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pccl-org/formal-logic-simple-order-multi-token-dynamic-objects-paired-relationship-0-100000 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 14125899544
num_examples: 49775250
download_size: 4793803997
dataset_size: 14125899544
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LightChen2333/OpenSLU | ---
license: mit
---
|
wheelernba/AIon | ---
license: mit
---
|
AdapterOcean/physics_dataset_standardized_unified | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 49677244
num_examples: 19999
download_size: 22747201
dataset_size: 49677244
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_unified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ioclab/lighttestout | ---
dataset_info:
features:
- name: image
dtype: image
- name: tags
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 846755243.74
num_examples: 3970
download_size: 843460816
dataset_size: 846755243.74
---
# Dataset Card for "lighttestout"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hard | ---
annotations_creators:
- found
language_creators:
- found
language:
- ar
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
paperswithcode_id: hard
pretty_name: Hotel Arabic-Reviews Dataset
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
'4': '5'
config_name: plain_text
splits:
- name: train
num_bytes: 27507085
num_examples: 105698
download_size: 8508677
dataset_size: 27507085
---
# Dataset Card for Hard
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Hard](https://github.com/elnagara/HARD-Arabic-Dataset)
- **Repository:** [Hard](https://github.com/elnagara/HARD-Arabic-Dataset)
- **Paper:** [Hotel Arabic-Reviews Dataset Construction for Sentiment Analysis Applications](https://link.springer.com/chapter/10.1007/978-3-319-67056-0_3)
- **Point of Contact:** [Ashraf Elnagar](ashraf@sharjah.ac.ae)
### Dataset Summary
This dataset contains 93,700 hotel reviews in Arabic. The reviews were collected from the Booking.com website during June/July 2016 and are expressed in Modern Standard Arabic as well as dialectal Arabic. The following table summarizes some statistics on the HARD dataset.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is in Arabic.
## Dataset Structure
### Data Instances
A typical data point comprises a hotel review text and its rating from 1 to 5.
### Data Fields
[More Information Needed]
### Data Splits
The dataset is not split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
### Contributions
Thanks to [@zaidalyafeai](https://github.com/zaidalyafeai) for adding this dataset. |
alinet/balanced_qg | ---
language:
- en
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: train.csv
- split: validation
path: validation.csv
- config_name: resolved
data_files:
- split: train
path: train_resolved.csv
- split: validation
path: validation.csv
- config_name: augmented
data_files:
- split: train
path: train_augmented.csv
- split: validation
path: validation.csv
- config_name: resolved_augmented
data_files:
- split: train
path: train_resolved_augmented.csv
- split: validation
path: validation.csv
tags:
- question-generation
task_categories:
- text2text-generation
--- |
Falah/flower_arrangement | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 367584
num_examples: 1000
download_size: 41547
dataset_size: 367584
---
# Dataset Card for "flower_arrangement"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_itsliupeng__llama2_70b_mmlu | ---
pretty_name: Evaluation run of itsliupeng/llama2_70b_mmlu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_70b_mmlu](https://huggingface.co/itsliupeng/llama2_70b_mmlu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_70b_mmlu\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T15:24:45.322816](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_70b_mmlu/blob/main/results_2023-12-29T15-24-45.322816.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7152638190052161,\n\
\ \"acc_stderr\": 0.02952331074524934,\n \"acc_norm\": 0.7204152549719242,\n\
\ \"acc_norm_stderr\": 0.030082250835189752,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.4914985403822716,\n\
\ \"mc2_stderr\": 0.0142870032875607\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.01388064457015621\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6779525990838479,\n\
\ \"acc_stderr\": 0.00466306082837678,\n \"acc_norm\": 0.8737303326030671,\n\
\ \"acc_norm_stderr\": 0.003314742077083317\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7547169811320755,\n \"acc_stderr\": 0.026480357179895695,\n\
\ \"acc_norm\": 0.7547169811320755,\n \"acc_norm_stderr\": 0.026480357179895695\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491223,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491223\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8451612903225807,\n \"acc_stderr\": 0.020579287326583227,\n \"\
acc_norm\": 0.8451612903225807,\n \"acc_norm_stderr\": 0.020579287326583227\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.03486731727419872,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"\
acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.735897435897436,\n \"acc_stderr\": 0.02235219373745328,\n \
\ \"acc_norm\": 0.735897435897436,\n \"acc_norm_stderr\": 0.02235219373745328\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9045871559633027,\n \"acc_stderr\": 0.012595899282335801,\n \"\
acc_norm\": 0.9045871559633027,\n \"acc_norm_stderr\": 0.012595899282335801\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.919831223628692,\n \"acc_stderr\": 0.017676679991891632,\n \
\ \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.017676679991891632\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8251121076233184,\n\
\ \"acc_stderr\": 0.02549528462644497,\n \"acc_norm\": 0.8251121076233184,\n\
\ \"acc_norm_stderr\": 0.02549528462644497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005471,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005471\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.02826881219254063,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.02826881219254063\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.028311601441438596,\n\
\ \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.028311601441438596\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446901,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446901\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n\
\ \"acc_stderr\": 0.011622736692041285,\n \"acc_norm\": 0.879948914431673,\n\
\ \"acc_norm_stderr\": 0.011622736692041285\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8063583815028902,\n \"acc_stderr\": 0.021274230317515557,\n\
\ \"acc_norm\": 0.8063583815028902,\n \"acc_norm_stderr\": 0.021274230317515557\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4759776536312849,\n\
\ \"acc_stderr\": 0.016703190189300193,\n \"acc_norm\": 0.4759776536312849,\n\
\ \"acc_norm_stderr\": 0.016703190189300193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.0231527224394023,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.0231527224394023\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8167202572347267,\n\
\ \"acc_stderr\": 0.02197419884826582,\n \"acc_norm\": 0.8167202572347267,\n\
\ \"acc_norm_stderr\": 0.02197419884826582\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062065,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062065\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144363,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144363\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5775749674054759,\n\
\ \"acc_stderr\": 0.01261560047573493,\n \"acc_norm\": 0.5775749674054759,\n\
\ \"acc_norm_stderr\": 0.01261560047573493\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041503,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041503\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7875816993464052,\n \"acc_stderr\": 0.01654714863620315,\n \
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.01654714863620315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594176,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155754,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155754\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.4914985403822716,\n\
\ \"mc2_stderr\": 0.0142870032875607\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320708\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5299469294920395,\n \
\ \"acc_stderr\": 0.013747759685444703\n }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_70b_mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|arc:challenge|25_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|gsm8k|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hellaswag|10_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T15-24-45.322816.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T15-24-45.322816.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- '**/details_harness|winogrande|5_2023-12-29T15-24-45.322816.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T15-24-45.322816.parquet'
- config_name: results
data_files:
- split: 2023_12_29T15_24_45.322816
path:
- results_2023-12-29T15-24-45.322816.parquet
- split: latest
path:
- results_2023-12-29T15-24-45.322816.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_70b_mmlu
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_70b_mmlu](https://huggingface.co/itsliupeng/llama2_70b_mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_70b_mmlu",
"harness_winogrande_5",
	split="latest")
```
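Besides "latest", each configuration also exposes a split named after the run timestamp, with `-` and `:` replaced by `_` so the name is a valid split identifier. A minimal sketch of that naming convention (the helper `timestamp_to_split` is illustrative, not part of any official API):

```python
# Illustrative helper: derive the timestamped split name used in this dataset
# from a run timestamp. This mirrors the observed naming convention only.
def timestamp_to_split(ts: str) -> str:
    # Replace characters that are not allowed in split names.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-29T15:24:45.322816"))
# 2023_12_29T15_24_45.322816
```

Passing the resulting string as `split=` lets you pin a specific run instead of following "latest".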
## Latest results
These are the [latest results from run 2023-12-29T15:24:45.322816](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_70b_mmlu/blob/main/results_2023-12-29T15-24-45.322816.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7152638190052161,
"acc_stderr": 0.02952331074524934,
"acc_norm": 0.7204152549719242,
"acc_norm_stderr": 0.030082250835189752,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.4914985403822716,
"mc2_stderr": 0.0142870032875607
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.01388064457015621
},
"harness|hellaswag|10": {
"acc": 0.6779525990838479,
"acc_stderr": 0.00466306082837678,
"acc_norm": 0.8737303326030671,
"acc_norm_stderr": 0.003314742077083317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7547169811320755,
"acc_stderr": 0.026480357179895695,
"acc_norm": 0.7547169811320755,
"acc_norm_stderr": 0.026480357179895695
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491223,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491223
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.735897435897436,
"acc_stderr": 0.02235219373745328,
"acc_norm": 0.735897435897436,
"acc_norm_stderr": 0.02235219373745328
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465715,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465715
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9045871559633027,
"acc_stderr": 0.012595899282335801,
"acc_norm": 0.9045871559633027,
"acc_norm_stderr": 0.012595899282335801
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.017676679991891632,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.017676679991891632
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8251121076233184,
"acc_stderr": 0.02549528462644497,
"acc_norm": 0.8251121076233184,
"acc_norm_stderr": 0.02549528462644497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005471,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.02826881219254063,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.02826881219254063
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.028311601441438596,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.028311601441438596
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446901,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446901
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041285,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041285
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8063583815028902,
"acc_stderr": 0.021274230317515557,
"acc_norm": 0.8063583815028902,
"acc_norm_stderr": 0.021274230317515557
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4759776536312849,
"acc_stderr": 0.016703190189300193,
"acc_norm": 0.4759776536312849,
"acc_norm_stderr": 0.016703190189300193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.0231527224394023,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.0231527224394023
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8167202572347267,
"acc_stderr": 0.02197419884826582,
"acc_norm": 0.8167202572347267,
"acc_norm_stderr": 0.02197419884826582
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062065,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062065
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144363,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144363
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5775749674054759,
"acc_stderr": 0.01261560047573493,
"acc_norm": 0.5775749674054759,
"acc_norm_stderr": 0.01261560047573493
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041503,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041503
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.01654714863620315,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.01654714863620315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824667,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824667
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594176,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.025172984350155754,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.025172984350155754
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.4914985403822716,
"mc2_stderr": 0.0142870032875607
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320708
},
"harness|gsm8k|5": {
"acc": 0.5299469294920395,
"acc_stderr": 0.013747759685444703
}
}
```
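The per-subtask `harness|hendrycksTest-*` entries above can be aggregated into a single MMLU score. A sketch, using a small subset of the numbers shown (an unweighted mean; the official leaderboard aggregation may differ, so treat this as illustrative only):

```python
# Average the per-subtask MMLU ("hendrycksTest") accuracies from the results
# dict above, ignoring non-MMLU tasks such as ARC.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8223684210526315},
    "harness|arc:challenge|25": {"acc": 0.6245733788395904},  # ignored below
}
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU average over {len(mmlu)} subtasks: {mmlu_avg:.4f}")
```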
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pawan2411/kdf_dev1 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: relation
dtype: string
splits:
- name: train
num_bytes: 1342651.1407338597
num_examples: 4301
- name: test
num_bytes: 1560.8592661402695
num_examples: 5
download_size: 640968
dataset_size: 1344212.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_0-hero__Matter-0.2-32B | ---
pretty_name: Evaluation run of 0-hero/Matter-0.2-32B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.2-32B](https://huggingface.co/0-hero/Matter-0.2-32B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.2-32B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-16T00:58:41.781770](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.2-32B/blob/main/results_2024-04-16T00-58-41.781770.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7375729303660368,\n\
\ \"acc_stderr\": 0.02929716865314781,\n \"acc_norm\": 0.7431817176816673,\n\
\ \"acc_norm_stderr\": 0.029851716626929974,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5251892212664137,\n\
\ \"mc2_stderr\": 0.014410220318581194\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472442,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168478\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6478789085839474,\n\
\ \"acc_stderr\": 0.004766553336917496,\n \"acc_norm\": 0.842162915753834,\n\
\ \"acc_norm_stderr\": 0.0036384306206139333\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n\
\ \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.028919802956134902,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.028919802956134902\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424045,\n\
\ \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6322751322751323,\n \"acc_stderr\": 0.024833839825562413,\n \"\
acc_norm\": 0.6322751322751323,\n \"acc_norm_stderr\": 0.024833839825562413\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\
\ \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n\
\ \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.033661244890514495,\n\
\ \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.033661244890514495\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476455,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476455\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.02056753956724681,\n \
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.02056753956724681\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235083,\n \
\ \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235083\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"\
acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481655,\n \"\
acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481655\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6944444444444444,\n \"acc_stderr\": 0.03141554629402544,\n \"\
acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03141554629402544\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.02087111845555211,\n \"\
acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.02087111845555211\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9529914529914529,\n\
\ \"acc_stderr\": 0.01386612005859485,\n \"acc_norm\": 0.9529914529914529,\n\
\ \"acc_norm_stderr\": 0.01386612005859485\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826373,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826373\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8863346104725415,\n\
\ \"acc_stderr\": 0.011350359050566026,\n \"acc_norm\": 0.8863346104725415,\n\
\ \"acc_norm_stderr\": 0.011350359050566026\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.02174251983527628,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.02174251983527628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.582122905027933,\n\
\ \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.582122905027933,\n\
\ \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02229285828456807,\n\
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02229285828456807\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8167202572347267,\n\
\ \"acc_stderr\": 0.021974198848265826,\n \"acc_norm\": 0.8167202572347267,\n\
\ \"acc_norm_stderr\": 0.021974198848265826\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.02088869041409387,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.02088869041409387\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284073,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284073\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5867014341590613,\n\
\ \"acc_stderr\": 0.012576779494860083,\n \"acc_norm\": 0.5867014341590613,\n\
\ \"acc_norm_stderr\": 0.012576779494860083\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8051470588235294,\n \"acc_stderr\": 0.024060599423487424,\n\
\ \"acc_norm\": 0.8051470588235294,\n \"acc_norm_stderr\": 0.024060599423487424\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.01740181671142765,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.01740181671142765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5251892212664137,\n\
\ \"mc2_stderr\": 0.014410220318581194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068677\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5602729340409401,\n \
\ \"acc_stderr\": 0.013672052434471576\n }\n}\n```"
repo_url: https://huggingface.co/0-hero/Matter-0.2-32B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|arc:challenge|25_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|gsm8k|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hellaswag|10_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-58-41.781770.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-16T00-58-41.781770.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- '**/details_harness|winogrande|5_2024-04-16T00-58-41.781770.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-16T00-58-41.781770.parquet'
- config_name: results
data_files:
- split: 2024_04_16T00_58_41.781770
path:
- results_2024-04-16T00-58-41.781770.parquet
- split: latest
path:
- results_2024-04-16T00-58-41.781770.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.2-32B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.2-32B](https://huggingface.co/0-hero/Matter-0.2-32B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.2-32B",
"harness_winogrande_5",
split="train")
```
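The repository name loaded above follows a simple convention: the `/` in the model name becomes `__`, prefixed with `details_`. As a minimal sketch (assuming this naming convention holds for other models evaluated on the leaderboard), the repo ID can be derived programmatically:

```python
def details_repo(model_name: str, org: str = "open-llm-leaderboard") -> str:
    """Build the details-dataset repo ID for a model evaluated on the
    Open LLM Leaderboard, replacing "/" with "__" and prefixing "details_"."""
    return f"{org}/details_{model_name.replace('/', '__')}"

print(details_repo("0-hero/Matter-0.2-32B"))
# → open-llm-leaderboard/details_0-hero__Matter-0.2-32B
```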
## Latest results
These are the [latest results from run 2024-04-16T00:58:41.781770](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.2-32B/blob/main/results_2024-04-16T00-58-41.781770.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7375729303660368,
"acc_stderr": 0.02929716865314781,
"acc_norm": 0.7431817176816673,
"acc_norm_stderr": 0.029851716626929974,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5251892212664137,
"mc2_stderr": 0.014410220318581194
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472442,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168478
},
"harness|hellaswag|10": {
"acc": 0.6478789085839474,
"acc_stderr": 0.004766553336917496,
"acc_norm": 0.842162915753834,
"acc_norm_stderr": 0.0036384306206139333
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.028919802956134902,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.028919802956134902
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424045,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6322751322751323,
"acc_stderr": 0.024833839825562413,
"acc_norm": 0.6322751322751323,
"acc_norm_stderr": 0.024833839825562413
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.033661244890514495,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.033661244890514495
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476455,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476455
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.02056753956724681,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.02056753956724681
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235083,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235083
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481655,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481655
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.02087111845555211,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.02087111845555211
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9529914529914529,
"acc_stderr": 0.01386612005859485,
"acc_norm": 0.9529914529914529,
"acc_norm_stderr": 0.01386612005859485
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826373,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826373
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8863346104725415,
"acc_stderr": 0.011350359050566026,
"acc_norm": 0.8863346104725415,
"acc_norm_stderr": 0.011350359050566026
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.02174251983527628,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.02174251983527628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.582122905027933,
"acc_stderr": 0.01649540063582008,
"acc_norm": 0.582122905027933,
"acc_norm_stderr": 0.01649540063582008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02229285828456807,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02229285828456807
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8167202572347267,
"acc_stderr": 0.021974198848265826,
"acc_norm": 0.8167202572347267,
"acc_norm_stderr": 0.021974198848265826
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.02088869041409387,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.02088869041409387
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284073,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284073
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5867014341590613,
"acc_stderr": 0.012576779494860083,
"acc_norm": 0.5867014341590613,
"acc_norm_stderr": 0.012576779494860083
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8051470588235294,
"acc_stderr": 0.024060599423487424,
"acc_norm": 0.8051470588235294,
"acc_norm_stderr": 0.024060599423487424
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5251892212664137,
"mc2_stderr": 0.014410220318581194
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.010796468688068677
},
"harness|gsm8k|5": {
"acc": 0.5602729340409401,
"acc_stderr": 0.013672052434471576
}
}
```
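Per-task scores in a results file like the one above can be aggregated programmatically; a minimal sketch (using a small hand-made stand-in for the full JSON, with the same `harness|<task>|<shots>` key layout) might look like:

```python
# Aggregate MMLU ("hendrycksTest") accuracies from an Open LLM
# Leaderboard-style results dict. A tiny fabricated dict stands in
# for the full results file shown above.
results = {
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.79, "acc_norm": 0.79},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.52, "acc_norm": 0.52},
    "harness|gsm8k|5": {"acc": 0.56},  # not an MMLU task, excluded below
}

mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
print(round(sum(mmlu_accs) / len(mmlu_accs), 4))  # 0.655
```

In practice the same loop runs over all 57 MMLU subtasks in the full file.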
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LeoCordoba/CC-NEWS-ES | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- es
license:
- mit
multilinguality:
- monolingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
source_datasets:
- cc-news
task_categories:
- summarization
- text-generation
task_ids: []
tags:
- conditional-text-generation
---
# Dataset Card for CC-NEWS-ES
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [CC-NEWS-ES dataset repository](https://huggingface.co/datasets/LeoCordoba/CC-NEWS-ES)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [Leonardo Ignacio Córdoba](https://www.linkedin.com/in/leonardo-ignacio-c%C3%B3rdoba/)
### Dataset Summary
CC-NEWS-ES is a Spanish-language news dataset. The corpus was built by extracting the Spanish-language articles from CC-NEWS (the news index of Common Crawl) for 2019, using a fastText model for language identification.
It contains a total of 7,473,286 texts and 1,812,009,283 words distributed as follows:
| domain | texts | words (approx.) |
|:----|-----------------:|-----------------:|
| ar | 532,703 | 145,127,000 |
| bo | 29,557 | 7,289,960 |
| br | 107 | 14,207 |
| cl | 116,661 | 33,463,300 |
| co | 78,662 | 19,264,900 |
| com | 3,650,950 | 844,094,000 |
| cr | 16,542 | 3,820,750 |
| es | 1,838,790 | 482,943,000 |
| gt | 4,833 | 838,121 |
| hn | 36,559 | 5,499,330 |
| mx | 724,908 | 162,198,000 |
| ni | 40,643 | 10,850,100 |
| pa | 18,447 | 4,347,240 |
| pe | 230,962 | 35,212,300 |
| pr | 7,756 | 1,663,300 |
| py | 30,651 | 20,807,700 |
| sv | 454 | 353,145 |
| uy | 80,948 | 27,256,200 |
| ve | 33,148 | 6,965,780 |
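The card does not publish the extraction code itself; the sketch below only illustrates the shape of the language-filtering step, with a naive stopword heuristic standing in for the fastText `lid.176` model (the real call is shown in a comment):

```python
# Sketch of the language-filtering step. The released corpus used a fastText
# language-identification model; here `predict_language` is a naive Spanish
# stopword stand-in so the surrounding filtering logic is self-contained.
# With fastText it would be roughly:
#   model = fasttext.load_model("lid.176.bin")
#   labels, probs = model.predict(text)   # e.g. ("__label__es",), (0.99,)
SPANISH_STOPWORDS = {"el", "la", "los", "las", "de", "que", "y", "en", "una"}

def predict_language(text: str) -> str:
    tokens = text.lower().split()
    hits = sum(t in SPANISH_STOPWORDS for t in tokens)
    return "es" if tokens and hits / len(tokens) > 0.15 else "other"

articles = [
    "La defensa de los Panthers fue una de las mejores de la liga",
    "The Panthers defense was one of the best in the league",
]
spanish = [a for a in articles if predict_language(a) == "es"]
print(len(spanish))  # 1
```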
### Supported Tasks and Leaderboards
TODO
### Languages
The text is in Spanish. The BCP-47 code for Spanish is `es`.
## Dataset Structure
### Data Instances
Each data instance contains the following features:
- country: top-level domain of the source site, which usually corresponds to a country (except `.com`).
- text: body of the news article.
- id: internal id.
An example from CC-NEWS-ES looks like the following:
```
{'country': 'py',
'text': '“La que asumió es una mujer que está en línea de sucesión. La policía, ni los militares están en el Palacio, lo que ella dijo fue que no se podía seguir reprimiendo al pueblo", manifestó este jueves el senador colorado, Enrique Riera, sobre la asunción presidencial en Bolivia de la senadora opositora, Jeanine Áñez,Riera agregó que Evo Morales el que "escapó y abandonó" a su pueblo al ir como asilado a México. En ese sentido, dijo que irónicamente, el expresidente boliviano no eligió como destino a Venezuela, Nicaragua ni a Cuba.Sostuvo que nos de debe utilizar a las instituciones democráticas y republicanas para llegar al poder, cambiando Constituciones y prorrogando mandatos una y otra vez. “El amigo Morales no respetó absolutamente nada”, subrayó.Por otra parte, el senador colorado mencionó que los fiscales y jueces bolivianos deberían tener el "coraje" de investigar el origen de la riqueza de Morales.Habló también sobre la situación en Venezuela y mencionó que Nicolás Maduro no cae, porque "toda la FFAA está contaminada de narcotráfico". El hombre cuenta con orden de prisión en su país por los ilícitos de Tráfico de Drogas y Asociación Criminal, según el Consejo Nacional de Justicia del Brasil.La agente fiscal Liliana Denice Duarte, titular de la Unidad Fiscal Nº 1 de Presidente Franco, requirió la expulsión del extranjero y la jueza Carina Frutos Recalde, mediante Auto Interlocutorio (A.I.) N° 2.153, dio curso favorable al pedido del Ministerio Público. Esto considerando la alta expectativa de pena que tiene el supuesto delincuente en su país.La detención ...',
'id': 7328086}
```
Note: the text is shortened for simplicity.
### Data Fields
- ...
- ...
### Data Splits
...
## Dataset Creation
### Curation Rationale
[N/A]
### Source Data
#### Initial Data Collection and Normalization
TODO
#### Who are the source language producers?
Common Crawl: https://commoncrawl.org/
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
...
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
This dataset is maintained by [Leonardo Ignacio Córdoba](https://www.linkedin.com/in/leonardo-ignacio-c%C3%B3rdoba/) and was built with the help of [María Gaska](https://www.linkedin.com/in/mfgaska/).
### Licensing Information
[N/A]
### Citation Information
TODO
### Contributions
[N/A] |
liuyanchen1015/MULTI_VALUE_wnli_it_dobj | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 718
num_examples: 4
- name: test
num_bytes: 1668
num_examples: 6
- name: train
num_bytes: 7943
num_examples: 42
download_size: 12669
dataset_size: 10329
---
# Dataset Card for "MULTI_VALUE_wnli_it_dobj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baaghi124/OpenOrca-Clean | ---
license: mit
language:
- en
---
This dataset is a cleaned and refined version of the original [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) dataset. |
mbk0asis/test_data | ---
license: openrail
---
|
chrisgru/commonsense-dialogues2 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 9152294
num_examples: 20176
- name: test
num_bytes: 941561
num_examples: 2158
- name: validation
num_bytes: 962952
num_examples: 2157
download_size: 6212665
dataset_size: 11056807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "commonsense-dialogues2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dbuos/oasst_top1_2023-08-25_languages | ---
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- conversational
dataset_info:
features:
- name: text
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 23211220
num_examples: 12947
download_size: 13220375
dataset_size: 23211220
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# OpenAssistant TOP-1 Conversation Threads
- [Guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco)-style export of the best conversation threads from the [open-assistant.io](https://open-assistant.io/) database
- exported August 25, 2023
- JSONL files with [ChatML](https://github.com/openai/openai-python/blob/main/chatml.md)-formatted conversations
- train: 12,947 samples
- includes a `lang` column indicating the language of each conversation |
albertvillanova/medmnist-v2 | ---
language: en
license: cc-by-4.0
multilinguality:
- monolingual
pretty_name: MedMNIST v2
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
- multi-label-image-classification
paperswithcode_id: medmnist-v2
tags:
- medical
---
# Dataset Card for MedMNIST v2
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://medmnist.com/
- **Repository:** https://github.com/MedMNIST/MedMNIST
- **Paper:** [MedMNIST v2 -- A large-scale lightweight benchmark for 2D and 3D biomedical image classification](https://arxiv.org/abs/2110.14795)
- **Leaderboard:**
- **Point of Contact:** [Bingbing Ni](mailto:nibingbing@sjtu.edu.cn)
### Dataset Summary
We introduce MedMNIST v2, a large-scale MNIST-like collection of standardized biomedical images, including 12 datasets for 2D and 6 datasets for 3D. All images are pre-processed into 28 x 28 (2D) or 28 x 28 x 28 (3D) with the corresponding classification labels, so that no background knowledge is required for users. Covering primary data modalities in biomedical images, MedMNIST v2 is designed to perform classification on lightweight 2D and 3D images with various data scales (from 100 to 100,000) and diverse tasks (binary/multi-class, ordinal regression and multi-label). The resulting dataset, consisting of 708,069 2D images and 9,998 3D images in total, could support numerous research / educational purposes in biomedical image analysis, computer vision and machine learning. We benchmark several baseline methods on MedMNIST v2, including 2D / 3D neural networks and open-source / commercial AutoML tools.
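The subsets are distributed as `.npz` archives of pre-processed arrays; the key names used below (`train_images`, `train_labels`) are assumptions for illustration, and a fabricated in-memory archive stands in for a real download:

```python
import io
import numpy as np

# Sketch: each MedMNIST subset ships as an .npz archive of pre-processed
# 28 x 28 images (or 28 x 28 x 28 volumes) plus classification labels.
# Key names here are assumed; a fabricated archive replaces a real download.
rng = np.random.default_rng(0)
buf = io.BytesIO()
np.savez(
    buf,
    train_images=rng.integers(0, 256, size=(100, 28, 28, 3), dtype=np.uint8),
    train_labels=rng.integers(0, 9, size=(100, 1), dtype=np.uint8),
)
buf.seek(0)

with np.load(buf) as data:
    x, y = data["train_images"], data["train_labels"]
print(x.shape, y.shape)  # (100, 28, 28, 3) (100, 1)
```

The same shape conventions (N x 28 x 28 x C for 2D, N x 28 x 28 x 28 for 3D) apply across all subsets, which is what makes the collection usable without domain background.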
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English (`en`).
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is licensed under [Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) (CC BY 4.0).
Each subset keeps the same license as that of the source dataset. Please also cite the corresponding paper of source data if you use any subset of MedMNIST.
### Citation Information
If you find this project useful, please cite both v1 and v2 papers:
```
@article{medmnistv2,
title={MedMNIST v2-A large-scale lightweight benchmark for 2D and 3D biomedical image classification},
author={Yang, Jiancheng and Shi, Rui and Wei, Donglai and Liu, Zequan and Zhao, Lin and Ke, Bilian and Pfister, Hanspeter and Ni, Bingbing},
journal={Scientific Data},
volume={10},
number={1},
pages={41},
year={2023},
publisher={Nature Publishing Group UK London}
}
@inproceedings{medmnistv1,
title={MedMNIST Classification Decathlon: A Lightweight AutoML Benchmark for Medical Image Analysis},
author={Yang, Jiancheng and Shi, Rui and Ni, Bingbing},
booktitle={IEEE 18th International Symposium on Biomedical Imaging (ISBI)},
pages={191--195},
year={2021}
}
```
Please also cite the corresponding paper(s) of source data if you use any subset of MedMNIST as per the description on the [project website](https://medmnist.com/).
### Contributions
Thanks to [@albertvillanova](https://huggingface.co/albertvillanova) for adding this dataset.
|
sorenmulli/nordjylland-news-summarization-subset | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: text_len
dtype: int64
- name: summary_len
dtype: int64
- name: ind
dtype: int64
splits:
- name: train
num_bytes: 243846
num_examples: 300
download_size: 162666
dataset_size: 243846
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# [WIP] Dataset Card for "nordjylland-news-summarization-subset"
*Please note that this dataset and its dataset card are both works in progress. For now, refer to the related [thesis](https://sorenmulli.github.io/thesis/thesis.pdf) for all details.*
|
Hittu99/bankcommunicationmask | ---
license: unknown
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/a77515c4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1331
dataset_size: 178
---
# Dataset Card for "a77515c4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cognitive-Lab/Kannada_Bilingual_Instruct | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: translation
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1217971903
num_examples: 905120
download_size: 535802559
dataset_size: 1217971903
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- kn
--- |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-preference-64-nsample-12_filter_gold_thr_0.3_self_160m | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_1
num_bytes: 44144780
num_examples: 18929
- name: epoch_0
num_bytes: 43597508
num_examples: 18929
download_size: 92852915
dataset_size: 87742288
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
---
|
Pa-satith/Mahidol_Deeplearning_Project_Classify_Friend_From_PopStar | ---
license: apache-2.0
---
|
Mtjay/myDataSet | ---
license: other
license_name: my-license
license_link: LICENSE
---
|
malaysia-ai/crawl-cambridge-english-malaysian | ---
language: ms
---
# Cambridge English-Malaysian Dictionary Crawl
Crawled from https://dictionary.cambridge.org/browse/english-malaysian/: 25,171 English-Malaysian words.
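The crawler itself lives in the linked notebooks; purely as an illustration, enumerating the dictionary's alphabetical browse index might be sketched as follows (the exact URL pattern is an assumption based on the browse page above, and fetching/parsing is omitted):

```python
import string

# Build the per-letter browse index URLs that a crawler would visit.
# The trailing "<letter>/" path segment is assumed for illustration.
BASE = "https://dictionary.cambridge.org/browse/english-malaysian/"
index_urls = [BASE + letter + "/" for letter in string.ascii_lowercase]
print(len(index_urls), index_urls[0])
# 26 https://dictionary.cambridge.org/browse/english-malaysian/a/
```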
Notebooks used to gather the dataset: https://github.com/huseinzol05/malay-dataset/tree/master/dictionary/cambridge |
zolak/twitter_dataset_79_1713082646 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3178168
num_examples: 7883
download_size: 1558513
dataset_size: 3178168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/VALUE_cola_negative_concord | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2852
num_examples: 32
- name: test
num_bytes: 3887
num_examples: 43
- name: train
num_bytes: 20147
num_examples: 258
download_size: 18300
dataset_size: 26886
---
# Dataset Card for "VALUE_cola_negative_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juletxara/xquad_xtreme | ---
pretty_name: XQuAD-XTREME
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
- es
- de
- el
- hi
- th
- ru
- tr
- ar
- vi
- zh
- ro
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets:
- extended|squad
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: xquad
---
# Dataset Card for XQuAD-XTREME
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/deepmind/xquad](https://github.com/deepmind/xquad)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 139.53 MB
- **Size of the generated dataset:** 18.09 MB
- **Total amount of disk used:** 157.62 MB
### Dataset Summary
XQuAD (Cross-lingual Question Answering Dataset) is a benchmark dataset for evaluating cross-lingual question answering
performance. The dataset consists of a subset of 240 paragraphs and 1190 question-answer pairs from the development set
of SQuAD v1.1 (Rajpurkar et al., 2016) together with their professional translations into eleven languages: Spanish, German,
Greek, Russian, Turkish, Arabic, Vietnamese, Thai, Chinese, Hindi and Romanian. Consequently, the dataset is entirely parallel across 12 languages.
We also include "translate-train", "translate-dev", and "translate-test"
splits for each non-English language from XTREME ([Hu et al., 2020](https://proceedings.mlr.press/v119/hu20b/hu20b.pdf)). These can be used to run XQuAD in the "translate-train" or "translate-test" settings.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### ar
- **Size of downloaded dataset files:** 12.68 MB
- **Size of the generated dataset:** 1.64 MB
- **Total amount of disk used:** 14.33 MB
An example of 'test' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [527],
"text": ["136"]
},
"context": "\"Die Verteidigung der Panthers gab nur 308 Punkte ab und belegte den sechsten Platz in der Liga, während sie die NFL mit 24 Inte...",
"id": "56beb4343aeaaa14008c925c",
"question": "Wie viele Sacks erzielte Jared Allen in seiner Karriere?"
}
```
#### de
- **Size of downloaded dataset files:** 12.68 MB
- **Size of the generated dataset:** 1.23 MB
- **Total amount of disk used:** 13.91 MB
An example of 'test' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [527],
"text": ["136"]
},
"context": "\"Die Verteidigung der Panthers gab nur 308 Punkte ab und belegte den sechsten Platz in der Liga, während sie die NFL mit 24 Inte...",
"id": "56beb4343aeaaa14008c925c",
"question": "Wie viele Sacks erzielte Jared Allen in seiner Karriere?"
}
```
#### el
- **Size of downloaded dataset files:** 12.68 MB
- **Size of the generated dataset:** 2.11 MB
- **Total amount of disk used:** 14.79 MB
An example of 'test' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [527],
"text": ["136"]
},
"context": "\"Die Verteidigung der Panthers gab nur 308 Punkte ab und belegte den sechsten Platz in der Liga, während sie die NFL mit 24 Inte...",
"id": "56beb4343aeaaa14008c925c",
"question": "Wie viele Sacks erzielte Jared Allen in seiner Karriere?"
}
```
#### en
- **Size of downloaded dataset files:** 12.68 MB
- **Size of the generated dataset:** 1.07 MB
- **Total amount of disk used:** 13.75 MB
An example of 'test' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [527],
"text": ["136"]
},
"context": "\"Die Verteidigung der Panthers gab nur 308 Punkte ab und belegte den sechsten Platz in der Liga, während sie die NFL mit 24 Inte...",
"id": "56beb4343aeaaa14008c925c",
"question": "Wie viele Sacks erzielte Jared Allen in seiner Karriere?"
}
```
#### es
- **Size of downloaded dataset files:** 12.68 MB
- **Size of the generated dataset:** 1.22 MB
- **Total amount of disk used:** 13.90 MB
An example of 'test' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [527],
"text": ["136"]
},
"context": "\"Die Verteidigung der Panthers gab nur 308 Punkte ab und belegte den sechsten Platz in der Liga, während sie die NFL mit 24 Inte...",
"id": "56beb4343aeaaa14008c925c",
"question": "Wie viele Sacks erzielte Jared Allen in seiner Karriere?"
}
```
### Data Fields
The data fields are the same among all splits.
#### ar
- `id`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
#### de
- `id`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
#### el
- `id`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
#### en
- `id`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
#### es
- `id`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
### Data Splits
| name | validation |
| -------- | ---------: |
| ar | 1190 |
| de | 1190 |
| el | 1190 |
| en | 1190 |
| es | 1190 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{Artetxe:etal:2019,
author = {Mikel Artetxe and Sebastian Ruder and Dani Yogatama},
title = {On the cross-lingual transferability of monolingual representations},
journal = {CoRR},
volume = {abs/1910.11856},
year = {2019},
archivePrefix = {arXiv},
eprint = {1910.11856}
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
Cohere/miracl-fr-corpus-22-12 | ---
annotations_creators:
- expert-generated
language:
- fr
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (fr) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-fr-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-fr-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
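As a rough sketch of this segmentation step (a simplified illustration with hypothetical sample text and helper name, not the actual WikiExtractor pipeline):

```python
def segment_article(title: str, plain_text: str) -> list:
    """Split an article's plain text into passages on blank lines,
    keeping the Wikipedia article title with each passage."""
    passages = []
    for chunk in plain_text.split("\n\n"):
        chunk = chunk.strip()
        if chunk:  # discard empty segments
            passages.append({"title": title, "text": chunk})
    return passages

docs = segment_article(
    "Paris",
    "Paris is the capital of France.\n\nIt lies on the Seine."
)
# Each dict is one "document", i.e. one unit of retrieval
```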
## Embeddings
We compute embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset
In [miracl-fr-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-corpus-22-12) we provide the corpus embeddings. Note that, depending on the selected split, the files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-fr-corpus-22-12", split="train")
```
Or you can stream it without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-fr-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-fr-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search the documents, you must use the **dot product**: compare the query embedding against the corpus embeddings, either with a vector database (recommended) or by computing the dot product directly.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB
from datasets import load_dataset
import torch
# Load documents + embeddings
docs = load_dataset("Cohere/miracl-fr-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])
# Load queries
queries = load_dataset("Cohere/miracl-fr-queries-22-12", split="dev")
# Select the first query as example
qid = 0
query = queries[qid]
query_embedding = torch.tensor(query['emb']).unsqueeze(0)  # shape: (1, dim)
# Compute dot scores between the query embedding and all document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
# Run: pip install cohere
import cohere
co = cohere.Client(api_key)  # add your Cohere API key here
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0] # Get the embedding for the first text
```
## Performance
In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.
We compute nDCG@10 (a ranking-based metric) as well as hit@3: whether at least one relevant document appears in the top-3 results. We find hit@3 easier to interpret, as it gives the fraction of queries for which a relevant document is found among the top-3 results.
Note: MIRACL only annotated a small fraction of passages (10 per query) for relevance. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as "annotation holes". Real nDCG@10 and hit@3 performance is likely higher than reported.
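For illustration, the two metrics can be computed with binary relevance as follows (a toy sketch with hypothetical document ids, not the actual evaluation code):

```python
import math

def hit_at_k(ranked_ids, relevant_ids, k=3):
    """1.0 if at least one relevant document appears in the top-k results."""
    return float(any(doc in relevant_ids for doc in ranked_ids[:k]))

def ndcg_at_k(ranked_ids, relevant_ids, k=10):
    """Binary-relevance nDCG@k: DCG of the ranking divided by the DCG
    of an ideal ranking that places all relevant documents first."""
    dcg = sum(1.0 / math.log2(i + 2)
              for i, doc in enumerate(ranked_ids[:k]) if doc in relevant_ids)
    ideal = sum(1.0 / math.log2(i + 2)
                for i in range(min(len(relevant_ids), k)))
    return dcg / ideal if ideal > 0 else 0.0

ranking = ["d3", "d1", "d7", "d2"]   # retrieved order for one query
relevant = {"d1", "d2"}              # annotated relevant documents
```

Averaging these per-query scores over all queries of a language gives values like those in the table (scaled to 0-100).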
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
Siki-77/amazon | ---
license: apache-2.0
---
|
katylee/atco-code | ---
license: mit
---
|
giux78/small-test-ultrafeedback-ita | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 7293099
num_examples: 1000
- name: test_sft
num_bytes: 697688
num_examples: 100
- name: train_gen
num_bytes: 7293099
num_examples: 1000
- name: test_gen
num_bytes: 697688
num_examples: 100
download_size: 8546266
dataset_size: 15981574
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
louisbrulenaudet/code-procedures-civiles-execution | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code des procédures civiles d'exécution
source_datasets:
- original
pretty_name: Code des procédures civiles d'exécution
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code des procédures civiles d'exécution, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries, each of which contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
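To illustrate how one of these instructions is paired with an article to form an `instruction`/`input`/`output` record, here is a minimal sketch (the article number, text, dates, and helper function are hypothetical; this is not the actual generation code):

```python
import random

# Small subset of the instruction list, for illustration
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Quel est le texte intégral de l'article ?",
]

def build_record(article_num, article_text, start, expiration):
    """Pair a randomly chosen instruction with one article (sketch)."""
    return {
        "instruction": random.choice(instructions),
        "input": "",  # input details left empty in this sketch
        "output": article_text,
        "start": start,
        "expiration": expiration,
        "num": article_num,
    }

record = build_record("L111-1", "Tout créancier peut ...",
                      "2013-06-01", "9999-12-31")
```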
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
pudim0/satoro | ---
license: cc-by-nc-4.0
---
|
LambdaTests/VQAv2Validation_ViT_L_14_A_T_C_D-PNP-FILTER_benchmarks_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 384
num_examples: 10
download_size: 0
dataset_size: 384
---
# Dataset Card for "VQAv2Validation_ViT_L_14_A_T_C_D-PNP-FILTER_benchmarks_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orgcatorg/almendron | ---
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: category
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 4589782
num_examples: 263
download_size: 2699295
dataset_size: 4589782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
silaseic/pokemon | ---
license: unknown
---
Dataset from
https://www.kaggle.com/datasets/rounakbanik/pokemon
|
JewelC/test | ---
license: cc-by-nc-sa-4.0
---
|
yamanahlawat/fox | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1970242.0
num_examples: 6
download_size: 1970753
dataset_size: 1970242.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|